
Richard Roberts 1940–2020

Richard Roberts

When theorist Richard “Dick” Roberts began his career in the 1960s, the strong force was largely mysterious. Today, with the advent of quantum chromodynamics (QCD), we understand the detailed quark and gluon sub-structure of protons and even atomic nuclei. This development is due in no small part to the work that Roberts performed with his collaborators Alan Martin, James Stirling and, latterly, Robert Thorne.

The eponymous MRS and MRST collaborations analysed deep-inelastic scattering and other hard-scattering data for more than three decades, extracting with ever higher precision the structure functions, and thereby the momentum distributions of quarks and gluons in the proton. The MRS(T) parton distribution functions became a staple of particle physics, key to much of the planning for experiments at the LHC and to the subsequent analyses that led to the discovery of the Higgs boson.

Dick Roberts was born in North Wales, UK in 1940. He studied mathematics at King’s College, London, and won the Drew medal for achieving the highest mathematics degree in the whole of the University of London. He went on to complete a PhD at Imperial College, followed by research at Durham, CERN and UC San Diego, and then, in 1971, the Rutherford High Energy Laboratory (today the Rutherford Appleton Laboratory) near Oxford, where he remained until his retirement in 2000.

Throughout his career, he specialised in the theory and phenomenology of the strong interaction. The 1960s were the days of Regge theory and, while at CERN, he started working on the related Veneziano model and ideas about duality, which he subsequently developed, mainly with Hong-Mo Chan and D P Roy. Towards the end of the 1970s, following the discovery of the J/ψ, he became one of the first to apply the then-novel ideas of QCD to the analysis of structure functions. With increasing precision, the MRS team extracted the parton distributions, which soon became a standard tool for the experimental analysis and interpretation of data. He also made important contributions to understanding the EMC effect – whereby the momentum distributions of quarks in atomic nuclei are subtly shifted relative to those of quarks in free nucleons – and to the proton spin puzzle of the 1980s. His pedagogic understanding of QCD shone in his 1990 textbook Structure of the Proton (Cambridge University Press).

During the 1990s Dick was quick to develop the phenomenological implications of supersymmetric grand-unified theories that might be tested by the LHC. He also tackled the mystery of the origin of quark mass structure in work that has stimulated much of the ongoing activity in this area.

His retirement from research in 2000 soon led to another career, one that revealed his talent for teaching. For the past 15 years he tutored first-year students at Oxford University’s Exeter College, and continued teaching until the university was closed by the COVID-19 pandemic in March 2020.

Quiet, unassuming but extremely effective, he was the powerhouse behind the scenes in many of his collaborations. Dick loved opera, piano playing, poetry, teaching, reading, sport, gardening and physics. He had a spark of good humour, a gentleness of spirit, and a warmth without parallel.

The Mirror Trap


A quantum physicist has mysteriously disappeared, leaving behind two mirrors, a strange machine, hallucinogenic drugs and a diary filled with ramblings and Feynman diagrams. His last thoughts reveal his views on the many-worlds interpretation – the controversial idea that there are as many worlds as there are possible outcomes in quantum measurements.

The Mirror Trap is an online performance where the audience has the chance to experiment with the psychology of self-identity and explore the interpretations of quantum mechanics. The public is asked to draw Feynman diagrams on a mirror, plonk themselves down in front of it and listen to the play using headphones, thereby transforming a dimly lit room into a private theatrical space.

The experience is hypnotic, eerie and introspective. Ideas at the intersection between physics and psychology are described in a beautifully written monologue. The protagonist believes that he has devised a new way to access a parallel universe and replicate Schrödinger’s thought experiment; however, he must play the role of the cat, and be observed. Under severe emotional pressure, he begs the audience to witness his desperate attempt to reach a universe where he did not make the biggest mistake of his life.

Visual and auditory illusions play tricks with the participants’ brains

While the physicist is digging deep into his psyche and preparing for a leap into the unknown, visual and auditory illusions play tricks with the participants’ brains. From Snow White to Alice Through the Looking-Glass, mirrors have been linked to mysterious portals, superstition and fairy tales. In this play, they are portals to other worlds, and also tools to reflect on life, self and perception. Many people feel subjective sensations of otherness and report dissociative identity effects when looking at themselves in a mirror. This strange-face-in-the-mirror illusion is more pronounced in dim light and is associated with Troxler’s fading and neural adaptation: when we look at an unchanging image, some features temporarily disappear from our perception and the brain fills in the missing information with other elements. This effect is particularly spooky when applied to one’s own face.

The performance was written, created and played by biologist and science communicator Simon Watt, with assistance from playwright Alexandra Wood. The 20-minute piece was followed by a discussion and question-and-answer session with Watt, psychologist Julia Shaw, and physicist Harry Cliff of LHCb and the University of Cambridge, who was scientific consultant for this work and guest physicist at the Bloomsbury Festival, under the auspices of which the piece was performed. Watt is now looking for other researchers and festivals interested in collaborating.

As arts and science festivals have moved online because of COVID-19 restrictions, this show found a creative way to engage the public while sitting at home. A well-thought-out merging of drama and science engagement, The Mirror Trap is an intense and intriguing experience for physicists and non-physicists alike.

To Russia with love

“Why do you give all those secrets to the Russians?” So teases an inebriated Mary Bunemann, confidante to the leading nuclear physicists at the UK’s Atomic Energy Research Establishment, at the emotional climax of Frank Close’s new book Trinity: The Treachery and Pursuit of the Most Dangerous Spy in History. The scene is a party on New Year’s Eve in 1949, in the cloistered laboratory at Harwell, in the Berkshire countryside. With her voice audible across a room populated by his close colleagues and friends, Bunemann unwittingly confronted theoretical physicist Klaus Fuchs with the truth of his double life. As Close’s text suspensefully unfolds, the biggest brain working on Britain’s effort to build a nuclear arsenal had been faced with the very same allegation by an MI5 interrogator just 10 days earlier.

Close’s story expands dramatically in scope when Peierls and Fuchs are recruited to the Manhattan Project

Klaus Fuchs began working on nuclear weapons in 1941, when he was recruited by Rudolf Peierls – the “midwife to the atomic age”, in Close’s estimation. Both men were refugees from Nazi Germany. A few years older, and better established in Britain, Peierls would become a friend and mentor to Fuchs. A quarter of a century later, Peierls would also establish a relationship with a young Frank Close, when he arrived at Oxford’s theoretical physics department. Close has now been able to make a poignant contribution to the literature of the bomb by drawing on his personal connection to the Peierls family, who felt Fuchs’ betrayal bitterly and were personally affected by the suspicion engendered by his espionage.

Close’s story expands dramatically in scope when Peierls and Fuchs are recruited to the Manhattan Project. Though Peierls was among the first to glimpse the power of atomic weapons, Fuchs began to exceed him in significance to the project during this period. In one of the strongest portions of the book, Close balances physics, politics and the intrigue of shady meetings with Fuchs’ handlers at a time when he passed to the Soviet Union a complete set of instructions for building the first stage of a uranium bomb, a full description of the plutonium bomb used in the Trinity test in the New Mexico desert, and detailed notes on Enrico Fermi’s lectures on the hydrogen bomb.

Intensely claustrophobic

The story becomes intensely claustrophobic when Fuchs returns to England to head the theoretical physics department at Harwell. Here, Close evokes the contradictions in Fuchs’ character: his conviction that nuclear knowledge should be shared between great powers to avert war; his principled but tested faith in communism, awakened while protesting the rise of Nazism; his devoted pastoral care for members of his inner circle at Harwell, even as the net closed around him; and his willingness to share not only nuclear secrets but also the bed of his colleague’s wife. Close has a particular obsession with the question of whether Fuchs’ eventual confession was induced by unrealistic suggestions that he could be forgiven and continue his work. But inducement did not jeopardise Fuchs’ ultimate conviction and imprisonment, despite MI5’s fears, and Close judges his 14-year sentence, later reduced, to be just. Even here, however, the Soviets had the last laugh, with Fuchs’ apprehension not only depriving the British nuclear programme of its greatest intellectual asset, but also precipitating the defection of Bruno Pontecorvo.

Trinity book cover

Close chose an ideal moment to research his history, writing with the benefit of newly released MI5 records, and before several others were withdrawn without notice. He applies forensic attention to the agency’s pursuit of the nuclear spy. Occasionally, however, this is to the detriment of the reader, with events seemingly diffracted onto the pages – both prefigured and returned to as the story progresses and new evidence comes to light. We step through time in Fuchs’ shoes, for example only learning at the end of the book that two other spies at the Manhattan Project were also passing information to the Russians. While Close’s inclination to let the evidence speak for itself is surely the mark of a good physicist, readers in search of a more analytical history may wish to also consult Mike Rossiter’s 2014 biography The Spy Who Changed the World: Klaus Fuchs and the secrets of the nuclear bomb, which offers a more rounded presentation of the Russian and American perspectives.

By bringing physics expertise, personal connections and impressive attention to detail to bear, Frank Close’s latest book has much to offer readers seeking insights into a formative time for the field, when the most talented minds in nuclear physics also bore the weight of world politics on their shoulders. He eloquently tells the tragedy of “the most dangerous spy in history”, as it played out between the trinity of Fuchs, his mentor Peierls and a shadowy network of spooks. Above all, the text is an intimate portrait of the inner struggles of a principled man who betrayed his adopted homeland, even as he grew to love it, and by doing so helped to shape the latter half of the 20th century.

Nuclear win for ISOLDE physicists

2020 Lise Meitner winners

The nuclear physics division of the European Physical Society today awarded the 2020 Lise Meitner Prize to three physicists who have played a decisive role in turning a small-scale nuclear-physics experiment at CERN into a world-leading facility for the investigation of nuclear structure.

Klaus Blaum (Max Planck Institute for Nuclear Physics), Björn Jonson (Chalmers University of Technology) and Piet Van Duppen (KU Leuven) are recognised for the development and application of online instrumentation and techniques, and for the precise and systematic investigation of properties of nuclei far from stability at CERN’s Isotope mass Separator On-Line facility (ISOLDE).

Blaum has made key contributions to the high-precision determination of nuclear ground-state properties with laser and mass-spectroscopic methods, and to the development of new techniques in this field. Jonson was acknowledged for his studies of the lightest exotic nuclei, notably halo nuclei, whose surprisingly large matter radii he was the first to explain. Van Duppen was recognised for his pioneering work on the production and investigation of post-accelerated radioactive beams with REX-ISOLDE. Since the 1960s, the ISOLDE user facility has produced extreme nuclear systems to help physicists understand how the strong interaction binds the ingredients of atomic nuclei, with advanced traps and lasers recently offering new ways to look for physics beyond the Standard Model.

I’m very impressed by the breadth of the recent prize winners

Eckhard Elsen

The biennial Lise Meitner Prize, named after one of the pioneers in the discovery of nuclear fission in 1939, was established in 2000 to acknowledge outstanding work in the fields of experimental, theoretical or applied nuclear science. Former winners include a quartet of physicists (Johanna Stachel, Peter Braun-Munzinger, Paolo Giubellino and Jürgen Schukraft) from the ALICE collaboration in 2014, for the experimental exploration of the quark–gluon plasma using ultra-relativistic nucleus–nucleus collisions, and for the design and construction of the ALICE detector.

This year’s awards were officially presented during the 2020 ISOLDE workshop and users meeting, held online on 26–27 November. “I’m very impressed by the breadth of the recent prize winners… covering a range of topics and varying between individuals and teams,” said CERN director for research and computing Eckhard Elsen during the award ceremony. “It is a good indicator of the health and the push of the field – it is truly alive.”

Beating cardiac arrhythmia

EBAMed’s technical team

In December last year, a beam of protons was used to treat a patient with cardiac arrhythmia – an irregular beating of the heart that affects around 15 million people in Europe and North America alone. The successful procedure, performed at the National Center of Oncological Hadrontherapy (CNAO) in Italy, signalled a new application of proton therapy, which has been used to treat upwards of 170,000 cancer patients worldwide since the early 1990s.

In parallel to CNAO – which is based on accelerator technologies developed in conjunction with CERN via the TERA Foundation – a Geneva-based start-up called EBAMed (External Beam Ablation), founded by CERN alumnus Adriano Garonna, aims to develop and commercialise image-guidance solutions for non-invasive treatments of heart arrhythmias. EBAMed’s technology is centred on an ultrasound imaging system that monitors a patient’s heart activity, interprets the motion in real time and signals the proton-therapy machine when the radiation should be delivered. Once targeted, the proton beam ablates specific heart tissue to stop the local conduction of disrupted electrical signals.
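As a caricature of the gating logic described above – image the heart, interpret the motion, signal the machine – here is a minimal sketch in Python. Everything in it (names, threshold, units) is illustrative and is not EBAMed’s actual algorithm or interface:

```python
# Illustrative beam-gating sketch (hypothetical, not EBAMed's code):
# enable the beam only while the tracked cardiac target stays within a
# tolerance window around its planned position.

def gate_signal(offsets_mm, tolerance_mm=1.5):
    """Return beam-on (True) / beam-off (False) for each tracked offset."""
    return [abs(offset) <= tolerance_mm for offset in offsets_mm]

# Simulated per-frame target offsets from an ultrasound stream (mm):
stream = [0.2, 0.9, 2.4, 3.1, 1.2, 0.4]
print(gate_signal(stream))  # [True, True, False, False, True, True]
```

In a real system the decision would run at the imaging frame rate, use a learned motion model rather than a fixed threshold, and compensate for latency – the sketch only shows where such a decision sits in the chain.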

Fast learner

“Our challenge was to find a solution using the precision of proton therapy on a fast and irregular moving target: the heart,” explains Garonna. “The device senses motion at a very fast rate, and we use machine learning to interpret the images in real time, which allows robust decision-making.” Unlike current treatments, which can be lengthy and costly, he adds, people can be treated as outpatients; the intervention is non-invasive and “completely pain-free”.

The recipient of several awards – including TOP 100 Swiss Startups 2019, Venture Business Plan 2018, MassChallenge 2018, Venture Kick 2018 and IMD 2017 Start-up Competition – EBAMed recently received a €2.4 million grant from the European Union to fund product development and the first human tests.

Garonna’s professional journey began when he was a summer student at CERN in 2007, working on user-interface software for a new optical position-monitoring system at LHC Point 5 (CMS). Following his graduation, Garonna returned to CERN as a PhD student with the TERA Foundation and École Polytechnique Fédérale de Lausanne, and then as a fellow working for the Marie Curie programme PARTNER, a training network for European radiotherapy. This led to a position as head of therapy accelerator commissioning at MedAustron in Austria – a facility for proton and ion therapy based, like CNAO, on TERA Foundation/CERN technology. After helping deliver the first patient treatments at MedAustron, Garonna returned to CERN and entered informal discussions with TERA founder Ugo Amaldi, who was one of Garonna’s PhD supervisors, about how to take the technology further. Along with former CERN engineer Giovanni Leo and arrhythmia expert Douglas Packer, the group founded EBAMed in 2018.

“Becoming an entrepreneur was not my initial purpose, but I was fascinated by the project and convinced that a start-up was the best vehicle to bring it to market,” says Garonna. Not having a business background, he benefitted from the CERN Knowledge Transfer entrepreneurship seminars as well as the support from the Geneva incubator Fongit and courses organised by Innosuisse, the Swiss innovation agency. Garonna also drew on previous experience gained while at CERN. “At CERN most of my projects involved exploring new areas. While I benefitted from the support of my supervisors, I had to drive projects on my own, seek the right solutions and build the appropriate ecosystem to obtain results. This certainly developed an initiative-driven, entrepreneurial streak in me.”

Healthy competition

Proton therapy is booming, with almost 100 facilities operating worldwide and more than 35 under construction. EBAMed’s equipment can be installed in any proton-therapy centre irrespective of its technology, says Garonna. “We already have prospective patients contacting us as they have heard of our device and wish to benefit from the treatment. As a company, we want to be the leaders in our field. We do have a US competitor, who has developed a planning system using conventional radiotherapy, and we are grateful that there is another player on the market as it helps pave the way to non-invasive treatments. Additionally, it is dangerous to be alone, as that could imply that there is no market in the first place.”

Leaving the security of a job to risk it all with a start-up is a gradual process, says Garonna. “It’s definitely challenging to jump into what seems like cold water… you have to think if it is worth the journey. If you believe in what you are doing, I think it will be worth it.”

In pursuit of the possible

Giulia Zanderighi

What do collider phenomenologists do?

I tend to prefer the term particle phenomenology because the collider is just the tool that we use. However, compared to other experiments, such as those searching for dark matter or axions, colliders provide a controlled laboratory where you decide how many collisions and what energy these collisions should have. This is quite unique. Today, accelerators and detectors have reached an immense level of sophistication, and this allows us to perform a vast amount of fundamental measurements. So, the field spans precision measurements of fundamental properties of particles, in particular of the Higgs boson, consistency tests of the Standard Model (SM), direct and indirect searches for new physics, measurements of rare decays, and much more. For essentially all these topics we have had new results in recent years, so it’s a very active and continuously evolving field. But of course we do not just measure things for the sake of it. We have big, fundamental questions and we are looking for hints from LHC data as to how to address them.

What’s hot in the field today?

One topic that I think is very cool is that we can benefit from the LHC, in its current setup, also as a lepton collider. In fact, at the LHC we are looking at elementary collisions between the proton’s constituents, quarks and gluons. But since the proton is charged, it also emits photons, and one can talk about the photon parton-distribution function (PDF), i.e. the photonic content of the proton. These photons can split into lepton pairs, so when one collides protons one is also colliding leptons. The fascinating thing is that the “content” of leptons in protons is rather democratic, so one can look at collisions between, say, a muon and a tau lepton – something that can’t be done even at future proposed lepton colliders. Furthermore, by picking up a lepton from one proton and a quark from the other proton, one can place new constraints on leptoquarks, and plenty of other things. This idea was already proposed in the 1990s, but was essentially forgotten because the lepton PDF was not known. Now we know it very precisely, bringing new possibilities. But let me stress that this is just one idea – there are many other new ideas out there. For instance, one major branch of phenomenology is to use machine learning or deep learning to recognise the SM and extract from data what is not SM-like.
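The underlying picture is the standard factorisation formula, with the sum over partons enlarged to include photons and leptons (a schematic form; PDF and scale conventions vary):

```latex
\sigma(pp \to X) = \sum_{a,b} \int_0^1 \mathrm{d}x_1\,\mathrm{d}x_2\;
  f_a(x_1,\mu)\, f_b(x_2,\mu)\;
  \hat{\sigma}_{ab \to X}(x_1 x_2 s, \mu),
\qquad a,b \in \{q, \bar{q}, g, \gamma, \ell\}
```

With leptons among the partons, a proton–proton collision contains, at some small rate, the μτ or lepton–quark subprocesses described above.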

I’m the first female director, which of course is a great responsibility

How does the Max Planck Institute differ from your previous positions, for example at CERN and Oxford?

A long time ago, somebody told me that the best thing that can happen to you in Germany is the Max Planck Society. It’s true. You are given independence and the means to fully focus on research and ideas, largely free of teaching duties or the need to apply for grants. Also, there are very valuable interactions with universities, be it in research or via the International Max Planck Research Schools for PhD students. Our institute in Munich is a unique place. One can feel it immediately. As a guest in the theory department, for example, you get to sit in the Heisenberg office, which feels like going back in time. Our institute was founded in Berlin in 1917 with Albert Einstein as its first director. In 1958 the institute moved to Munich with Werner Heisenberg as director. After more than 100 years I’m the first female director, which of course is a great responsibility. But I also really loved both CERN and Oxford. At CERN I felt like I was at the centre of the world. It is such a vibrant environment, and I loved the proximity to the experiments and the chats in the cafeteria about calculations or measurements. In Oxford I loved the multidisciplinary aspect, the dinners in college sitting next to other academics working in completely different fields. I guess I’m lucky that I’ve been in so many and such different places.

What is the biggest challenge to reach higher precision in quantum-field-theory calculations of key SM processes?

Scattering processes

The biggest challenge is that often there is no single biggest challenge. For instance, for inclusive Higgs-boson production we have a number of theoretical uncertainties, but they are all quite comparable in size. This means that to reduce the overall uncertainty considerably, one needs to reduce all uncertainties, and they all have very different physics origins and difficulties – from a better understanding of the incoming parton densities and a better knowledge of the strong coupling constant, to higher-order QCD or electroweak effects and effects related to heavy particles in virtual loops. Computing power can be a limiting factor for certain calculations, so making things numerically more efficient is also important. One of the main goals of the coming years will be the calculation of 2 → 3 scattering processes at the LHC at next-to-next-to-leading order (NNLO) in QCD. For instance, a milestone will be the calculation of top-pair production in association with a Higgs boson at that level of accuracy. This is the process where we can measure the top-Yukawa coupling most directly. The importance of this measurement can’t be overstressed. While the big discovery at the LHC is so far the Higgs boson, one should also remember that the Yukawa interaction is a new type of fundamental interaction, which is proportional to the mass of the particle, just like gravity, and yet so different from gravity. For some calculations, NNLO is already enough in terms of perturbative precision; going to N3LO doesn’t really add much yet. But there are a few cases where it helps already, such as super-clean Drell–Yan processes.
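As a reminder of what the acronyms mean, a cross-section is expanded in powers of the strong coupling, with process-dependent coefficients (shown here schematically):

```latex
\sigma = \sigma_{\mathrm{LO}} \left[ 1 + c_1\,\alpha_s + c_2\,\alpha_s^2
       + c_3\,\alpha_s^3 + \mathcal{O}(\alpha_s^4) \right]
```

Keeping the c₁ term gives NLO, c₂ gives NNLO and c₃ gives N3LO; with αs ≈ 0.1 at LHC scales, each additional order typically shrinks the residual perturbative uncertainty by a factor of a few.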

Is there a level of precision of LHC measurements beyond which indirect searches for new physics are no longer fruitful?

We will never rule out precision measurements as a route to search for new physics. We can always extend the reach and enhance the sensitivity of indirect searches. By increasing precision, we are exploring deeper in the ultraviolet region, where we can start to become sensitive to states exchanged in the loops that are more and more heavy. There is a limit, but we are very far from it. It’s like looking with a better and better microscope: the better the resolution, the more one can explore. However, the experimental precision has to go hand in hand with theoretical precision, and this is where the real challenge for phenomenologists lies. Of course, if you have a super precise measurement but no theory prediction, or vice versa, then you can’t do much with it. With the Higgs boson I am confident that the theory calculations will not be the deal breaker. We will eventually hit the wall in terms of experimental precision, but you can’t put a figure on where this will happen. Until you see a deviation you are never really done.
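The microscope analogy can be made quantitative in effective-field-theory language (a schematic estimate, not a statement about any particular measurement): a heavy state of mass Λ typically shifts a low-energy observable by a relative amount

```latex
\frac{\delta O}{O} \sim c\,\frac{v^2}{\Lambda^2}
\quad\Longrightarrow\quad
\Lambda_{\mathrm{reach}} \sim v\,\sqrt{\frac{c}{\delta_{\mathrm{exp}}}}
```

so halving the combined experimental-plus-theoretical uncertainty extends the mass reach by a factor of √2 – steady improvement with no natural endpoint, which is why one is never really done until a deviation appears.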

How would you characterise the state of particle physics today?

When I entered the field as a student, there were high expectations that supersymmetry would be discovered quickly at the LHC. Now things have turned out to be different, but this is what makes it exciting and challenging – even more so, because the same mysteries are still there. We have big, fundamental questions. We have hints from theory, from experiments. We have a powerful, multi-purpose machine – the LHC – that is only just getting started and will provide much more data. Of course, expectations like the quick discovery of supersymmetry have not been fulfilled, but nature is how it is. I think that progress in physics is driven by experiments. We have beautiful exceptions where progress comes from theory, like general relativity, or the postulation of the mechanism for electroweak symmetry breaking. When I think about the Higgs mechanism, I am still astonished that such a simple and powerful idea postulated in 1964 turns out to be realised in nature. But these cases, where theory precedes experiments, are the exception not the rule. In most cases progress in physics comes from observations. After all, it is a natural science, it is not mathematics.

There are some questions that are really tough, and that we may never see answered. But with the LHC there are many other, smaller questions we certainly can address, such as understanding the proton structure or studying the interaction potential between nucleons and strange baryons, which is relevant to understanding the physics of neutron stars – and these still advance knowledge. The brightest minds are attracted to the biggest problems, and this will always draw young researchers into the field.

Is naturalness a good guiding force in fundamental research?

Yes. We have plenty of examples where naturalness, in the sense of a quadratic sensitivity to an unknown ultraviolet scale, leads to postulating a new particle: the self-energy of the electron (leading to the positron), the charged and neutral pion mass difference (leading to the ρ meson) or the kaon transition rates and mixing, which led to the postulation of the existence of the charm quark in 1970, before its direct discovery in 1974 at SLAC and Brookhaven. In everyday life we constantly assume naturalness, so yes, it is puzzling that the Higgs mass appears to be fine-tuned. Certainly, there is a lot we still don’t understand here, but I would not give up on naturalness, at least not that easily. In the case of the electroweak naturalness problem, it is clear that any solution, such as supersymmetry or compositeness, will also leave an imprint in the Higgs couplings. So the LHC can, in principle, tell us about naturalness even if we do not discover new physics directly; we just have to measure very precisely whether the Higgs-boson couplings align on a straight line in the mass-versus-coupling plane.
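The “quadratic sensitivity” referred to here is, schematically, the one-loop correction to the Higgs mass from physics at a cutoff scale Λ; the coefficient shown is the commonly quoted top-loop contribution:

```latex
\delta m_H^2 \simeq -\frac{3\,y_t^2}{8\pi^2}\,\Lambda^2 + \ldots
```

For Λ anywhere near the Planck scale this correction dwarfs the observed mH ≈ 125 GeV, which is the fine-tuning the question alludes to.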

The presence of dark matter is overwhelming in the universe and it is embarrassing that we know little to nothing about its nature

Which collider should follow the LHC?

That is the billion-dollar question – I mean, the 25-billion-dollar question! To me, one should go for the machine that explores the new energy frontier as far as possible, namely a 100 TeV hadron collider. It is a compromise between what we might be able to achieve from a machine-building, accelerator and engineering point of view, and really exploring a new frontier. For instance, at a 100 TeV machine one can measure the Higgs self-coupling, which is intimately connected with the Higgs potential and with the profound question of the stability of the vacuum.

Which open question would you most like to see answered during your career?

Probably the nature of dark matter. The presence of dark matter is overwhelming in the universe and it is embarrassing that we know little to nothing about its nature and properties. There are many exciting possibilities, ranging from the lightest neutral states in new-physics models to a non-particle-like interpretation, like black holes. Either way, an answer to this question would be an incredible breakthrough.

How to find a Higgs boson


Finding Higgs bosons can seem esoteric to the uninitiated. The spouse of a colleague of mine has such trouble describing what their partner does that they read from a card in the event that they are questioned on the subject. Do you experience similar difficulties in describing what you do to loved ones? If so, then Ivo van Vulpen’s book How to find a Higgs boson may provide you with an ideal gift opportunity.

Readers will feel like they are talking physics over a drink with van Vulpen, who is a lecturer at the University of Amsterdam and a member of the ATLAS collaboration. Originally published as De melodie van de natuur, the book’s Dutch origins are unmistakable. We read about Hans Lippershey’s lenses, Antonie van Leeuwenhoek’s microbiology, Antonius van den Broek’s association of charge with the number of electrons in an atom, and even Erik Verlinde’s theory of gravity as an emergent entropic force. Though the Higgs is dangled at the end of chapters as a carrot to keep the reader reading, van Vulpen’s text isn’t an airy pamphlet cashing in on the 2012 discovery, but a realistic representation of what it’s like to be a particle physicist. When he counsels budding scientists to equip themselves better than the North Pole explorer who sets out with a Hugo Boss suit, a cheese slicer and a bicycle, he tells us as much about himself as about what it’s like to be a physicist.

Van Vulpen is a truth-teller who isn’t afraid to dent the romantic image of serene progress orchestrated by a parade of geniuses. 9,999 out of every 10,000 predictions from “formula whisperers” (theorists) turn out to be complete hogwash, he writes, in the English translation by David McKay. Sociological realities such as “mixed CMS–ATLAS” couples temper the physics, which is unabashedly challenging and unvarnished. The book boasts a particularly lucid and intelligible description of particle detectors for the general reader, and has a nice focus on applications: particle accelerators are discussed in relation to the “colour X-rays” of the Medipix project, spin in the context of MRI, radioactivity with reference to locating blocked arteries, and antimatter in the context of PET scans. Key ideas are brought to life in cartoons by Serena Oggero, formerly of the LHCb collaboration.

The weak interaction is like a dog on an attometre-long chain.

Attentive readers will occasionally be frustrated. For example, despite a stated aim of the book being to fight “formulaphobia”, Bohr’s famous recipe for energy levels lacks the crucial minus sign just a few lines before a listing of –3.6 eV (as opposed to –13.6 eV) for the energy of the ground state. Van Vulpen compares the beauty seen by physicists in equations to the beauty glimpsed by musicians as they read sheet music, but then prints Einstein’s field equations with half the tensor indices missing. But to quibble about typos in the English translation would be to miss the point of the book, which is to allow readers “to impress friends over a drink,” and talk physics “next time you’re in a bar”. Van Vulpen’s writing is always entertaining, but never condescending. Filled with amusing but perceptive one-liners, the book is perfectly calibrated for readers who don’t usually enjoy science. Life in a civilisation that evolved before supernovas would have no cutlery, he observes. Neutrinos are the David Bowie of particles. The weak interaction is like a dog on an attometre-long chain.
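For reference (this is standard textbook material, not quoted from the review), the Bohr expression in question reads, with the minus sign that marks the electron as bound:

```latex
E_n = -\frac{13.6\,\text{eV}}{n^2}, \qquad n = 1, 2, 3, \ldots
```

so the ground state (n = 1) sits at −13.6 eV, not −3.6 eV.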

This book could be the perfect gift for a curious spouse. But beware: fielding questions on the excellent last chapter, which takes in supersymmetry, SO(10), and millimetre-scale extra dimensions, may require some revision.

Black holes attract 2020 Nobel Prize

Penrose, Ghez and Genzel

The 2020 Nobel Prize in Physics, announced on 6 October, has recognised seminal achievements in the theoretical and experimental understanding of black holes. One half of the SEK 10 million ($1.15 million) award went to Roger Penrose of the University of Oxford “for the discovery that black-hole formation is a robust prediction of the general theory of relativity”. The other half was awarded jointly to Andrea Ghez of the University of California, Los Angeles and Reinhard Genzel of the Max Planck Institute for Extraterrestrial Physics “for the discovery of a supermassive compact object at the centre of our galaxy”, after the pair led separate research teams during the 1990s to identify a black hole at the centre of the Milky Way.

You might ask where the greatest entropy is in the universe – by an absolutely enormous factor it is in black holes

Roger Penrose

As soon as Einstein had completed his theory of general relativity in 1915, it was clear that solutions in the vicinity of a spherically symmetric, non-rotating mass allow space–time to be “pinched” to a point, or singularity, where known physics ceases to apply. Few people, however, including Einstein himself, thought that black holes really exist. But 50 years later, Penrose invented a mathematical tool called a trapped surface to show that black holes are a natural consequence of general relativity, proving that each hides a singularity. His groundbreaking article (Phys. Rev. Lett. 14 57) is heralded as the first post-Einsteinian result in general relativity.

Penrose is also known for the “Penrose process”, whereby a particle that enters the ergosphere of a rotating black hole can split in two, with one fragment falling in on a negative-energy orbit and the other escaping with more energy than the original particle carried in, thereby extracting rotational energy and angular momentum from the black hole. Among many other contributions, he also proposed twistor theory, which has evolved into a rich branch of theoretical and mathematical physics with potential relevance to the unification of general relativity and quantum mechanics.

“I really had to have a good idea of the space–time geometry. Not just 3D, you had to think of the whole 4D space–time… I do most of my thinking in visual terms, rather than writing down equations,” said Penrose in an interview with the Nobel Foundation following the award. “Black holes have become more and more important, also in ways that people don’t normally appreciate. They are the basis of the second law of thermodynamics… You might ask where the greatest entropy is in the universe – by an absolutely enormous factor it is in black holes.”

Preparatory ‘pre-lab’ proposed for ILC

ILC accelerating module

On 10 September the International Committee for Future Accelerators (ICFA) announced the structure and members of a new organisational team to prepare a “pre-laboratory” for an International Linear Collider (ILC) in Japan. The ILC International Development Team (ILC-IDT), which consists of an executive board and three working groups governing the pre-lab setup, accelerator, and physics and detectors, aims to complete the preparatory phase for the pre-lab on a timescale of around 1.5 years.

We hope that the effort by our Japanese colleagues will result in a positive move by the Japanese government

Tatsuya Nakada

The aim of the pre-lab is to prepare the ILC project, should it be approved, for construction. It is to be based on memoranda of understanding among participating national and regional laboratories, rather than on intergovernmental agreements, explains Tatsuya Nakada of École Polytechnique Fédérale de Lausanne, chair of the ILC-IDT executive board. “The ILC-IDT is preparing a proposal for the organisational and operational framework of the pre-lab, which will have a central office in Japan hosted by the KEK laboratory,” says Nakada. “In parallel to our activities, we hope that the effort by our Japanese colleagues will result in a positive move by the Japanese government that is equally essential for establishing the pre-laboratory.”

In June the Linear Collider Board and Linear Collider Collaboration, which were established by ICFA in 2013 to promote the case for an electron–positron linear collider and its detectors as a worldwide collaborative project, reached the end of their terms in view of ICFA’s decision to set up the ILC-IDT.

The ILC has been on the table for almost two decades. Shortly after the discovery of the Higgs boson in 2012, the Japanese high-energy physics community proposed to host the estimated $7 billion project, with Japan’s prime minister at that time, Yoshihiko Noda, stressing the importance of establishing an international framework. In 2018 ICFA backed the ILC as a Higgs factory operating at a centre-of-mass energy of 250 GeV – half the energy set out five years earlier in the ILC’s technical design report.

Higgs factory

An electron–positron Higgs factory is the highest-priority next collider, concluded the 2020 update of the European strategy for particle physics (ESPPU). The ESPPU recommended that Europe, together with its international partners, explore the feasibility of a future hadron collider at CERN at the energy frontier with an electron–positron Higgs factory as a possible first stage, noting that the timely realisation of the ILC in Japan “would be compatible with this strategy”. Two further proposals exist: the Compact Linear Collider at CERN and the Circular Electron–Positron Collider in China. While the ILC is the most technically ready Higgs-factory proposal (see p35), physicists are still awaiting a concrete decision about its future.

In March 2019 Japan’s Ministry of Education, Culture, Sports, Science and Technology (MEXT) expressed “continued interest” in the ILC, but announced that it had “not yet reached declaration” for hosting the project, arguing that it required further discussion in formal academic decision-making processes. In February KEK submitted an application for the ILC project to be considered in the MEXT 2020 roadmap for large-scale research projects. KEK withdrew the application the following month, announcing the move in September following the establishment of the ILC-IDT.

The ministry will keep an eye on discussions by the international research community

Koichi Hagiuda

“The ministry will keep an eye on discussions by the international research community while exchanging opinions with government authorities in the US and Europe,” said Koichi Hagiuda, Japanese minister of education, culture, sports, science and technology, at a press conference on 11 September.

Steinar Stapnes of CERN, who is a member of the ILC-IDT executive board representing Europe, says that clear support from the Japanese government is needed for the ILC pre-lab. “The overall project size is much larger than the usual science projects being considered in these processes and it is difficult to see how it could be funded within the normal MEXT budget for large-scale science,” he says. “During the pre-lab phase, intergovernmental discussions and negotiation about the share of funding and responsibilities for the ILC construction need to take place and hopefully converge.”

CERN publishes first environmental report

Safety engineering and environment group members

“It is our vision for CERN to be a role model for environmentally responsible research,” writes CERN Director-General Fabiola Gianotti in her introduction to a landmark environmental report released by the laboratory on 9 September. While CERN has a longstanding framework in place for environmental protection, and has documented its environmental impact for decades, this is its first public report. Two years in the making, and prepared according to the Global Reporting Initiative Sustainability Reporting Standards, it details the status of CERN’s environmental footprint, along with objectives for the coming years.

Given the energy consumption of large particle accelerators, environmental impact is a topic of increasing importance for high-energy physics research worldwide. Among the recommendations of the 2020 update of the European strategy for particle physics was a strong emphasis on the need to continue with efforts to minimise the environmental impact of accelerator facilities and maximise the energy efficiency of future projects.

When the Large Hadron Collider (LHC) is operating, CERN uses an average of 4300 TJ of electricity every year (30–50% less when not in operation) – enough energy to power just under half of the 200,000 homes in the canton of Geneva. “This is an inescapable fact, and one that CERN has always taken into consideration when designing new facilities,” states Frédérick Bordry, director for accelerators and technology.
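As a back-of-envelope check of the figures quoted above (the unit conversions are standard; the 45% share used below is an illustrative reading of “just under half”, not a number from the report):

```python
# Rough cross-check of CERN's quoted electricity consumption:
# 4300 TJ/year expressed in MWh, and the implied usage per home
# if that powers "just under half" of Geneva's 200,000 homes.
# The 45% share is an illustrative assumption, not a reported figure.

J_PER_TJ = 1e12    # joules per terajoule
J_PER_MWH = 3.6e9  # joules per megawatt-hour (1 MWh = 3.6 GJ)

annual_mwh = 4300 * J_PER_TJ / J_PER_MWH
homes_powered = 0.45 * 200_000

print(f"Annual consumption: {annual_mwh:,.0f} MWh")
print(f"Implied usage per home: {annual_mwh / homes_powered:.1f} MWh/year")
```

The result, roughly 1.2 TWh per year, or on the order of 13 MWh per household, is consistent in magnitude with the comparison made in the report.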

Action plan

An energy-management panel established at CERN in 2015 has already led to actions, including free cooling and air-flow optimisation, better optimised LHC cryogenics, and the implementation of SPS magnetic cycles and stand-by modes that significantly reduce energy consumption. The LHC delivered twice as much data per joule in its second run (2015–2018) as in its first (2010–2013), states the new report. With the High-Luminosity LHC due to deliver a tenfold increase in luminosity towards the end of the decade, CERN has made it a priority to limit the increase in energy consumption to 5% up to the end of 2024, with longer-term objectives to be set in future reports.

CERN procures its electricity mainly from France, whose production capacity is 87.9% carbon-free. In terms of direct greenhouse-gas emissions, the 192,000 tonnes of carbon-dioxide equivalent emitted by CERN in 2018 is mainly due to fluorinated gases used in the LHC detectors for cooling, particle detection, air conditioning and electrical insulation. CERN has set a formal objective that, by 2024, direct greenhouse emissions will be reduced by 28% by replacing fluorinated gases – which were designed in the 1990s to be ozone-friendly – with carbon dioxide, which has a global-warming potential several thousand times lower.

CERN has set a formal objective that, by 2024, direct greenhouse emissions will be reduced by 28%

Other areas of environmental significance studied in the report include radiation exposure, noise and waste. CERN commits to limit the emission of ionising radiation to no more than 0.3 mSv per year – less than a third of the annual dose limit for public exposure set by the European Council. The report states that the actual dose to any member of the public living in the immediate vicinity of CERN due to the laboratory’s activities is below 0.02 mSv per year, which is less than the exposure received from cosmic radiation during a transatlantic flight.

A 2018 measurement campaign showed that noise levels at CERN have not changed since the early 1990s, and are low by urban standards. Nevertheless, CERN has invested CHF 0.7 million to reduce noise at its perimeters to below 70 dB during the day and 60 dB at night (which corresponds to the level of conversational speech). The organisation has also introduced approaches to preserve the local landscape and protect flora, including 15 species of orchid growing on CERN’s sites.

Waste not

Water consumption, mostly drawn from Lac Léman, has slowly decreased over the past 10 years, the report notes, and CERN commits to keeping the increase in water consumption below 5% to the end of 2024, despite a growing demand for cooling from upgraded facilities. CERN also eliminates 100% of its waste, states the report, and has a recycling rate of 56% for non-hazardous waste (which comprises 81% of the total). A major project under construction since last year will see waste hot water from the cooling system for LHC Point 8 (where the LHCb experiment is located) channelled to a heating network in the nearby town of Ferney-Voltaire from 2022, with LHC Points 2 and 5 being considered for similar projects.

CERN plans to release further environment reports every two years. “Today, more than ever, science’s flag-bearers need to demonstrate their relevance, their engagement, and their integration into society as a whole,” writes Gianotti. “This report underlines our strong commitment to environmental protection, both in terms of minimising our impact and applying CERN technologies for environmental protection.”
