
Dipole marks path to future collider

Installation of the MDP

Researchers in the US have demonstrated an advanced accelerator dipole magnet with a field of 14.1 T – the highest ever achieved for such a device at an operational temperature of 4.5 K. The milestone is the work of the US Magnet Development Program (MDP), which includes Fermilab, Lawrence Berkeley National Laboratory (LBNL), the National High Magnetic Field Laboratory and Brookhaven National Laboratory. The MDP’s “cos-theta 1” (MDPCT1) dipole, made from Nb3Sn superconductor, beats the 13.8 T at 4.5 K achieved by the LBNL magnet “HD2” a decade ago, and follows the 14.6 T at 1.9 K (13.9 T at 4.5 K) reached in 2018 by “FRESCA 2” at CERN, which was built as a superconducting-cable test station. Together with other recent advances in accelerator magnets in Europe and elsewhere, the result sends a positive signal for the feasibility of next-generation hadron colliders.

The MDP was established in 2016 by the US Department of Energy to develop magnets that operate as closely as possible to the fundamental limits of superconducting materials while minimising the need for magnet training. The programme aims to integrate domestic accelerator-magnet R&D and position the US in the development of technology for future high-energy proton-proton colliders, including a possible 100 km-circumference facility at CERN under study by the Future Circular Collider (FCC) collaboration. In addition to the baseline design of MDPCT1, other design options for such a machine have been studied and will be tested in the coming years.

“The goal for this first magnet test was to limit the coil mechanical pre-load to a safe level, sufficient to produce a 14 T field in the magnet aperture,” explains MDPCT1 project leader Alexander Zlobin of Fermilab. “This goal was achieved after a short magnet training at 1.9 K: in the last quench at 4.5 K the magnet reached 14.1 T. Following this successful test the magnet pre-stress will be increased to reach its design limit of 15 T.”

The result sends a positive signal for the feasibility of next-generation hadron colliders

The development of high-field superconducting accelerator magnets has received a strong boost from high-energy physics over recent decades. The current state of the art is represented by the LHC dipole magnets, which operate at 1.9 K to produce a field of around 8 T, enabling proton-proton collisions at an energy of 13 TeV. Exploring higher energies, up to 100 TeV at a possible future circular collider, requires higher magnetic fields to steer the more energetic beams. The goal is to double the field strength compared to the LHC dipoles, reaching up to 16 T, which calls for innovative magnet design and a different superconductor from the Nb-Ti used in the LHC. Currently, Nb3Sn (niobium tin) is being explored as a viable candidate for reaching this goal. High-temperature superconductors, such as REBCO, MgB2 and iron-based materials, are also being studied.
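The field requirement follows from the standard relation between beam momentum, dipole field and bending radius,

$$ p\,[\mathrm{GeV}/c] \;\approx\; 0.3\,B\,[\mathrm{T}]\,\rho\,[\mathrm{m}]. $$

As a back-of-the-envelope check (the 10 km bending radius below is an assumed round number, not a design value): the LHC’s nominal 8.33 T dipoles over a bending radius of about 2.8 km correspond to 7 TeV per beam, while 16 T over roughly 10 km of bending radius in a 100 km ring would give close to 50 TeV per beam, i.e. collisions at around 100 TeV.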

HL-LHC first

The first accelerator magnets to use Nb3Sn technology are the 11 T dipole magnets and the final-focusing magnets under development for the high-luminosity LHC (HL-LHC), which will be installed around the interaction points. But the FCC would require more than 5000 superconducting dipoles, grouped for powering in series and operating continuously over long periods. A number of critical aspects underlie the design, cost-effective manufacturing and reliable operation of 16 T dipole magnets in future colliders. Among the targets for the Nb3Sn conductor is a critical current density of 1500 A/mm2 at 16 T and 4.2 K – almost a 50% increase compared to the current state of the art. In addition to the conductor, developing an industry-adapted design for 16 T dipoles and other accelerator magnets with higher performance presents a major challenge.

Training quench history for the MDPCT1 demonstrator magnet

The FCC collaboration has launched a rigorous R&D programme towards 16 T magnets. Key components are the global Nb3Sn conductor development programme, featuring a network of academic institutes and industrial partners, and the 16 T magnet-design work package supported by the EU-funded EuroCirCol project. This is now being followed by a 16 T short-model programme that aims to construct model magnets with several partners worldwide, such as the US MDP. Unit lengths of Nb3Sn wires with performance at least comparable to that of the HL-LHC conductor have already been produced by industry and cabled at CERN, while, at Fermilab, multi-filamentary wire produced with an internal-oxidation process has already exceeded the critical-current-density target for the FCC – just two examples of many recent advances in this area. EuroCirCol, which officially wound up this year (see “EuroCirCol comes full circle”), has also enabled a design and cost model for the magnets of the FCC, demonstrating the feasibility of Nb3Sn technology.

“The enthusiasm of the worldwide superconductor community and the achievements are impressive,” says Amalia Ballarino, leader of the conductor activity at CERN. “The FCC conductor development targets are very challenging. The demonstration of a 14 T field in a dipole accelerator magnet and the possibility of reaching the target critical current density in R&D wires are milestones in the history of Nb3Sn conductor and a reassuring achievement for the FCC magnet development programme.”

CERN and the Higgs Boson

CERN and the Higgs Boson, by James Gillies

James Gillies’ slim volume CERN and the Higgs Boson conveys the sheer excitement of the hunt for the eponymous particle. It is a hunt that had its origins at the beginning of the last century, with the discovery of the electron, quantum mechanics and relativity, and which was only completed in the first decades of the next. It is also a hunt throughout which CERN’s science, technology and culture grew in importance. Gillies has produced a lively and enthusiastic text that explores the historical, theoretical, experimental, technical and political aspects of the search for the Higgs boson without going into oppressive scientific detail. It is rare that one comes across a monograph as good as this.

Gillies draws attention to the many interplays and dialectics that led to our present understanding of the Higgs boson. First of all, he brings to light the scientific issues associated with the basic constituents of matter, and the forces and interactions that give rise to the Standard Model. Secondly, he highlights the symbiotic relationship between theoretical and experimental research, each leading the other in turn, and taking the subject forward. Finally, he shows the inter-development of the accelerators, detectors and experimental methods to which massive computing power had eventually to be added. This is all coloured by a liberal sprinkling of anecdotes about the people that made it all possible.

Complementing this is the story of CERN, both as a laboratory and as an institution, traced over the past 60 years or so, through to its current pre-eminent standing. Throughout the book the reader learns just how important the people involved really are to the enterprise: their sheer pleasure, their commitment through the inevitable ups and downs, and their ability to collaborate and compete in the best of ways.

A ripping yarn, then, which it might seem churlish to criticise. But then again, that is the job of a reviewer. There is, perhaps, an excessively glossy presentation of progress, and the exposition presses forward apace without conveying the many downs of cutting-edge research: the technical difficulties and the immensely hard decisions that have to be made during such enormous endeavours. Doing science is great fun but also very difficult – but then what are challenges for?

There is, perhaps, an excessively glossy presentation of progress

A pertinent example in the Higgs-boson story not emphasised in the book occurred in 2000. The Large Electron–Positron collider (LEP) was due to be closed down to make way for the LHC, but late in the year LEP’s ALEPH detector recorded events suggesting that a Higgs boson might have been observed at a mass of 114–115 GeV – a hint that, unfortunately, was not seen by the other experiments (see p32). Exactly this situation had been envisaged when not one but four LEP experiments were approved in the 1980s. After considerable discussion LEP’s closure went ahead, much to the unhappiness and anger of a large group of scientists who believed they were on the verge of a great discovery. This made for a very difficult environment at CERN for a considerable time thereafter. We now know that the Higgs boson was found at the LHC with a mass of 125 GeV, vindicating the original decision of 2000.

A few more pictures might help the text and fix the various contributors in readers’ minds, though clearly the book, part of a series of short volumes by Icon Books called Hot Science, is formatted for brevity. I also found the positioning of the important material on applications such as positron emission tomography and the world wide web to be unfortunate, situated as it is in the final chapter, entitled “What’s the use?” Perhaps instead the book could have ended on a more upbeat note by returning to the excitement of the science and technology, and the enthusiasm of the people who were inspired to make the discovery happen.

CERN and the Higgs Boson is a jolly good read and recommended to everyone. Whilst far from the first book on the Higgs boson, Gillies’ offering distinguishes itself with its concise history and the insider perspective available to him as CERN’s head of communications from 2003 to 2015: the denouement of the hunt for the Higgs.

From SUSY to the boardroom

Former particle physicist Andy Yen has set himself a modest goal: to transform the business model of the internet. In the summer of 2013, following the Snowden security leaks, he and some colleagues at CERN became concerned about the lack of data privacy and the growing inability of individuals to control their own data on the internet. It prompted him – at the time a PhD student from Harvard University working on supersymmetry searches in the ATLAS experiment – and two others to invent “ProtonMail”, an ultra-secure e-mail system based on end-to-end encryption.

The Courier met with Yen and Bart Butler, ProtonMail’s chief technology officer and fellow CERN alumnus, at the company’s Geneva headquarters to find out how a discussion in CERN’s Restaurant 1 was transformed into a company with more than 100 employees serving more than 10 million users.

If you are a Gmail user, then you are not Google’s customer, you are the product that Google sells to its real customer, which is advertisers

“The business model of the internet today really isn’t compatible with privacy,” explains Yen. “It’s all about the relationship between the provider and customer. If you are a Gmail user, then you are not Google’s customer, you are the product that Google sells to its real customer, which is advertisers. With ProtonMail, the people who are paying us are also our users. If we were ever to betray the trust of the user base, which is paying us precisely for reasons of privacy, then the whole business model collapses.”

Anyone can sign up for a ProtonMail account. Doing so generates a pair of public and private keys based on secure RSA-type encryption implementations and open-source cryptographic libraries. User data is encrypted using a key that ProtonMail does not have access to, which means the company cannot decrypt or access a user’s messages (nor offer data recovery if a password is forgotten). The challenge, says Yen, was not so much in developing the underlying algorithms, but in applying this level of security to an e-mail service in a user-friendly way.
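The scheme is easy to caricature in a few lines of Python. The sketch below, using the open-source cryptography library, illustrates the asymmetric principle only; it is not ProtonMail’s actual implementation, which is OpenPGP-based and, like all practical systems, uses the asymmetric keys to protect symmetric message keys rather than the messages themselves.

```python
# Minimal sketch of the end-to-end principle described above, using the
# open-source "cryptography" library. Illustrative only: not ProtonMail's
# actual OpenPGP-based implementation.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# Signing up generates a key pair; the private key stays with the user,
# so the service provider cannot read the stored messages.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Anyone holding the public key can encrypt a message to the account...
ciphertext = public_key.encrypt(b"see you in Restaurant 1", oaep)

# ...but only the private-key holder can decrypt it.
assert private_key.decrypt(ciphertext, oaep) == b"see you in Restaurant 1"
```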

In 2014 Yen and ProtonMail’s other co-founders, Jason Stockman and Wei Sun, entered a competition at MIT to pitch the idea. They lost, but reasoned that they had already built the thing and got a couple of hundred CERN people using it, so why not open it up to the world and see what happens? Within three days of launching the website 10,000 people had signed up. It was surprising and exciting, says Yen, but also scary. “E-mail has to work. A bank or something might close down their websites for an hour of maintenance once in a while, but you can’t do that with e-mail,” he says.

ProtonMail’s CERN origins (the name came from the fact that its founders were working on the Large Hadron Collider) meant that the technology could first come under the scrutiny of technically minded people – “early adopters”, who play a vital role in the life cycle of new products. But what is acceptable to tech-minded people is not necessarily what broader users want, says Yen. He quickly realised that the company had to grow, and that he faced a “tough and high-risk” decision between ProtonMail and his academic career. When he eventually decided to take the leap, Harvard granted him a leave of absence, and Yen set about dealing with the tens of thousands of users who were waiting to get onto the service.

In need of cash, the fledgling software outfit decided to try something unusual: crowdfunding. This approach broke new ground in Switzerland, and ProtonMail soon became a test case in tax law as to whether such payments should be considered revenue or donations (the authorities eventually settled on the former). But the effort was a huge success, raising 0.5 million Swiss francs in a little over two months. “Venture capital (VC) was a mystery to us,” says Yen. “We didn’t know anybody, we didn’t have a business plan, we were just a few people writing code. But, funnily enough, the crowdfunding, in addition to the money itself, got a lot of attention and this attracted interest from VCs.” A few months later, ProtonMail had received 2 million Swiss francs in seed funding.

“It is one thing to have an idea – then we had to actually do what we’d promised: build a team, hire people, scale up the product and have some sort of company to run things, with corporate identity, accounting, tax compliance, etc. There wasn’t really a marketing plan… it was more of a technical challenge to build the service,” says Yen. “If I was to give advice to someone in my position five years ago, then there isn’t a lot I could say. Starting a company is something new for almost everybody who does it, and I don’t think physicists are at a disadvantage compared to someone who went to business school. All you have to do is work hard, keep learning and you have to have the right people around you.”

It’s not a traditional company – 10–15% of the staff today is CERN scientists

It was around that time, in 2015, that Butler – also a former ATLAS experimentalist working on supersymmetry, and a one-time supervisor of Yen – joined ProtonMail. “A lot of that year was based around evolving the product,” he says. “There was a big difference between what the product originally was versus what it needed to be to scale up. It’s not a traditional company – 10–15% of the staff today is CERN scientists. A lot of former physicists have developed into really good software engineers, but we’ve had to bring in properly trained software engineers to add the rigour that we need. At the end of the day, it’s easier to teach a string theorist how to code than it is to teach advanced mathematics and complex cryptographic concepts to someone who codes.”

With the company, Proton Technologies, by then well established – and Yen having found time to hotfoot it back to Harvard for one “very painful and ridiculous” month to write up his PhD thesis – the next milestone came in 2016, when ProtonMail was officially launched. It was time to begin charging for accounts, and to provide those who had already signed up with premium paid-for services. It was the ultimate test of the business model: would enough people be prepared to pay for secure e-mail to make ProtonMail a viable and even profitable business? The answer turned out to be “yes”, says Yen. “2016 was make or break because eventually the funding was going to run out. We discussed whether we should raise money to buy us more time. But we decided just to work our asses off instead. We came very close, but we started generating revenue just as the VC cash ran out.”

Since then, ProtonMail has continued to scale up its services, for instance introducing mobile apps, and its user base has grown to more than 10 million. “Our main competitors are the big players, Google and Microsoft,” says Yen. “If you look at what Google offers today, it’s actually really nice to use. So the longer vision is: can we offer what Google provides — services that are secure, private and beneficial to society? There is a lot to build there, ProtonDrive, ProtonCalendar, for example, and we are working to put together that whole ecosystem.”

A big part of the battle ahead is getting people to understand what is happening with the internet and their data, says Butler. “Nobody is saying that when Google or Facebook began they went out to grab people’s data. It’s just the way the internet evolved: people like free things. But the pitfalls of this model are becoming more and more apparent. If you talk to consumers, there is no choice in the market: until now, the only option was e-mail that sold your data. So we want to provide that private option online. I think this choice is really important for the world and it’s why we do what we do.”


Cloud services take off in the US and Europe

Fermilab has announced the launch of HEPCloud, a step towards a new computing paradigm in particle physics to deal with the vast quantities of data pouring in from existing and future facilities. The aim is to allow researchers to “rent” high-performance computing centres and commercial clouds at times of peak demand, thus reducing the costs of providing computing capacity. Similar projects are also gaining pace in Europe.

“Traditionally, we would buy enough computers for peak capacity and put them in our local data centre to cover our needs,” says Fermilab’s Panagiotis Spentzouris, one of HEPCloud’s drivers. “However, the needs of experiments are not steady. They have peaks and valleys, so you want an elastic facility.” All Fermilab experiments will soon submit jobs to HEPCloud, which provides a uniform interface so that researchers don’t need expert knowledge about where and how best to run their jobs.
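The elasticity Spentzouris describes can be reduced to a toy decision rule. In the Python sketch below, every name and number is invented for illustration and does not correspond to HEPCloud’s actual interfaces:

```python
# Toy illustration of an "elastic facility": own enough cores for typical
# demand, and rent HPC or commercial-cloud capacity only during peaks.
# All names and numbers are invented for illustration.
LOCAL_CORES = 10_000  # owned capacity, sized for the valleys, not the peaks

def provision(requested_cores: int) -> dict:
    local = min(requested_cores, LOCAL_CORES)
    rented = requested_cores - local  # burst capacity, paid for on demand
    return {"local": local, "rented": rented}

print(provision(2_000))    # a valley: served entirely in-house
print(provision(160_000))  # a peak: most capacity rented from outside
```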

The idea dates back to 2014, when Spentzouris and Fermilab colleague Lothar Bauerdick assessed the volumes of data coming from Fermilab’s neutrino programme and the US participation in CERN’s Large Hadron Collider (LHC) experiments. The first demonstration of HEPCloud on a significant scale came in February 2016, when the CMS experiment used it to run on about 60,000 cores on the Amazon cloud, AWS, and, later that year, on 160,000 cores using Google Cloud Services. Most recently, in May 2018, the NOvA team at Fermilab was able to execute around 2 million hardware threads on a supercomputer at the National Energy Research Scientific Computing Center of the US Department of Energy’s Office of Science. HEPCloud project members now plan to enable experiments to use the state-of-the-art supercomputing facilities run by the DOE’s Advanced Scientific Computing Research programme at Argonne and Oak Ridge national laboratories.

Europe’s Helix Nebula

CERN is leading a similar project in Europe called the Helix Nebula Science Cloud (HNSciCloud). Launched in 2016 and supported by the European Union (EU), it builds on work initiated by EIROforum in 2010 and aims to bridge cloud computing and open science. Working with IT contractors, HNSciCloud members have so far developed three prototype platforms and made them accessible to experts for testing.

The results and lessons learned are contributing to the implementation of the European Open Science Cloud

“The HNSciCloud pre-commercial procurement finished in December 2018, having shown the integration of commercial cloud services from several providers (including Exoscale and T-Systems) with CERN’s in-house capacity in order to serve the needs of the LHC experiments as well as use cases from life sciences, astronomy, and photon and neutron science,” explains project leader Bob Jones of CERN. “The results and lessons learned are contributing to the implementation of the European Open Science Cloud, where a common procurement framework is being developed in the context of the new OCRE [Open Clouds for Research Environments] project.”

The European Open Science Cloud, an EU-funded initiative started in 2015, aims to bring efficiencies and make European research data more shareable and reusable. To help European research infrastructures move towards this open-science future, a €16 million EU project called ESCAPE (European Science Cluster of Astronomy & Particle Physics ESFRI research infrastructures) was launched in February. The 3.5-year project, led by the CNRS, will see 31 facilities in astronomy and particle physics collaborate on cloud computing and data science, including CERN, the European Southern Observatory, the Cherenkov Telescope Array, KM3NeT and the Square Kilometre Array (SKA).

In the context of ESCAPE, CERN is leading the effort to prototype and implement a FAIR (findable, accessible, interoperable, reusable) data infrastructure based on open-source software, explains Simone Campana of CERN, who is deputy project leader of the Worldwide LHC Computing Grid (WLCG). “This work complements the WLCG R&D activity in the area of data organisation, management and access in preparation for the HL-LHC. In fact, the computing activities of the CERN experiments at HL-LHC and other initiatives such as SKA will be very similar in scale, and will likely coexist on a shared infrastructure.”

Galaxies thrive on new physics

This supercomputer-generated image of a galaxy suggests that general relativity might not be the only way to explain how gravity works. Theorists at Durham University in the UK simulated the universe using hydrodynamical simulations based on “f(R) gravity” – in which a scalar field enhances gravitational forces in low-density regions (such as the outer parts of a galaxy) but is screened by the so-called chameleon mechanism in high-density environments such as our solar system (see C Arnold et al. Nature Astronomy; arXiv:1907.02977).
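For readers unfamiliar with the notation: in one common convention, f(R) theories add a function of the Ricci scalar R to the Einstein–Hilbert action of general relativity,

$$ S \;=\; \int d^4x\,\sqrt{-g}\,\frac{R + f(R)}{16\pi G} \;+\; S_{\mathrm{m}}, $$

where S_m is the matter action. A constant f reduces to general relativity with a cosmological constant, while a non-trivial f introduces the extra scalar degree of freedom whose enhanced force the chameleon mechanism screens in dense environments.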

The left-hand side of the image shows the scalar field of the theory: bright-yellow regions correspond to large scalar-field values, while dark-blue regions correspond to very small values, i.e. regions where screening is active and the theory behaves like general relativity. The right-hand side shows the gas density with stars overplotted. The study, which comprised a total of 12 simulations for different model parameters and resolutions and required a total runtime of about 2.5 million core-hours, shows that spiral galaxies like our Milky Way could still form even with different laws of gravity.

“Our research definitely does not mean that general relativity is wrong, but it does show that it does not have to be the only way to explain gravity’s role in the evolution of the universe,” says lead author Christian Arnold of Durham University’s Institute for Computational Cosmology.

Music of the muons

Subatomic Desire

Swiss composer Alexandre Traube and the Genevan video-performer Silvia Fabiani have collaborated to form music and dance troupe Les Atomes Dansants, with the aims of using CMS data to explore the links between science and art, and of establishing a dialogue between Eastern and Western culture. Premiering their show Subatomic Desire at CERN’s Globe of Science and Innovation on 21 June during Geneva’s annual Fête de la Musique, they took the act to the detector that served as their muse by performing in the hangar above the CMS experiment.

Muon tracks from W, Z and Higgs events served as inspiration for Traube, who was advised by CMS physicist Chiara Mariotti of INFN. He began by associating segments of CMS’s muon system with notes. Inspired by the detectors’ arrangement as four nested dodecagons, he assigned a note of the chromatic scale to each of the 12 sides of the innermost layer and, to the corresponding segment in the outer layer, the note a sonorous perfect fourth above it. Developing an initial plan to also link the two intermediate layers of the muon system to specific frequencies, he associated two intermediate microtonal notes with the transverse momentum and rapidity of the tracks. At several moments during the performance the musicians improvise using the resulting four-note sequences: an expression of quantum indeterminacy, according to Traube. Fabiani’s video projections add to the surreal atmosphere by transposing the sequences into colours, with an animation of bullets referencing the brass from Russian Second World War navy shells that was used to build CMS’s hadronic calorimeter.
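The two-layer mapping is concrete enough to sketch in code. The Python fragment below is a toy reconstruction from the description above, not the composer’s actual tooling (a perfect fourth is five semitones):

```python
# Toy reconstruction of the note mapping described above (illustrative only).
# Inner muon layer: the 12 sides of the dodecagon -> 12 chromatic notes.
NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def inner_note(segment: int) -> str:
    return NOTES[segment % 12]

def outer_note(segment: int) -> str:
    # The outer layer sounds a perfect fourth (5 semitones) above the
    # corresponding inner segment.
    return NOTES[(segment + 5) % 12]

# A muon crossing segment 3 of both layers would sound D# and G#.
print(inner_note(3), outer_note(3))
```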

Clad in lab coat, Einstein wig and reversed baseball cap, Doc MC Carré raps formulas and boogies around the stage

In concert with the audiovisual display, three performers sing about their love for the microcosm. Clad in lab coat, Einstein wig and reversed baseball cap, Doc MC Carré (David Charles) raps formulas and boogies around the stage. He is accompanied by Doc Lady Emmy, played by the soprano Marie-Najma Thomas, and Poète Atomique – the Persian singer Taghi Akhabari – who peppers the performance with mystical extracts from Sufi poets Rûmi and Attâr, and medieval German abbess Hildegard of Bingen, each of whom explores themes of the natural world in their writings. The performers contend that the lyrics speak about desire as the fuel for everything at the micro- and macroscale. Elaborate, contemporary and rich in metaphors, this is an experience that some will find abstruse but others will love.

Subatomic Desire will next be performed in Neuchâtel, Switzerland on 14 September.

Interdisciplinary physics at the AEDGE

Frequency niche

Following the discovery of gravitational waves by the LIGO and Virgo collaborations, there is great interest in observing other parts of the gravitational-wave spectrum and seeing what they can tell us about astrophysics, particle physics and cosmology. The European Space Agency (ESA) has approved the LISA space experiment that is designed to observe gravitational waves in a lower frequency band than LIGO and Virgo, while the KAGRA experiment in Japan, the INDIGO experiment in India and the proposed Einstein Telescope (ET) will reinforce LIGO and Virgo. However, there is a gap in observational capability in the intermediate-frequency band where there may be signals from the mergers of massive black holes weighing between 100 and 100,000 solar masses, and from a first-order phase transition or cosmic strings in the early universe.

This was the motivation for a workshop held at CERN on 22 and 23 July that brought experts from the cold-atom community together with particle physicists and representatives of the gravitational-wave community. Experiments using cold atoms as clocks and in interferometers offer interesting prospects for detecting some candidates for ultralight dark matter as well as gravitational waves in the mid-frequency gap. In particular, a possible space experiment called AEDGE could complement the observations by LIGO, Virgo, LISA and other approved experiments.

The workshop shared information about long-baseline terrestrial cold-atom experiments that are already funded and under construction, such as MAGIS in the US, MIGA in France and ZAIGA in China, as well as ideas for future terrestrial experiments such as MAGIA-advanced in Italy, AION in the UK and ELGAR in France. Delegates also heard about experiments using cold atoms in space and microgravity: the CACES (China) and CAL (NASA) space experiments, and the MAIUS (Germany) sounding-rocket experiment.

A suggestion for an atom interferometer using a pair of satellites is being put forward by the AEDGE team

ESA has recently issued a call for white papers for its Voyage 2050 long-term science programme, and a suggestion for an atom interferometer using a pair of satellites is being put forward by the AEDGE team (in parallel with a related suggestion called STE-QUEST) to build upon the experience with prior experiments. AEDGE was the focus of the CERN workshop, and would have unique capabilities to probe the assembly of the supermassive black holes known to power active galactic nuclei, physics beyond the Standard Model in the early universe and ultralight dark matter. AEDGE would be a uniquely interdisciplinary space mission, harnessing cold-atom technologies to address key issues in fundamental physics, astrophysics and cosmology.

Higgs hunters still hungry in Paris

Participants at Higgs Hunting 2019

The 10th Higgs Hunting workshop took place in Orsay and Paris from 29–31 July, attracting 110 physicists for lively discussions about recent results in the Higgs sector. The ATLAS and CMS collaborations presented Run 2 analyses with up to 140 fb⁻¹ of data collected at a centre-of-mass energy of 13 TeV. The statistical uncertainty on some Higgs properties, such as the production cross-section, has now been reduced by a factor of three compared to Run 1. This puts some Higgs studies on the verge of being dominated by systematic uncertainties. By the end of the LHC’s programme, measurements of the Higgs couplings to the photon, W, Z, gluon, tau lepton and top and bottom quarks are all expected to be dominated by theoretical rather than statistical or experimental uncertainties.
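That factor is roughly what simple statistics predicts, since statistical uncertainties scale as $1/\sqrt{N}$. Taking the quoted 140 fb⁻¹ for Run 2 against roughly 20 fb⁻¹ per experiment in Run 1 (an assumed round number),

$$ \frac{\delta_{\mathrm{stat}}^{\mathrm{Run\,1}}}{\delta_{\mathrm{stat}}^{\mathrm{Run\,2}}} \;=\; \sqrt{\frac{N_{\mathrm{Run\,2}}}{N_{\mathrm{Run\,1}}}} \;\approx\; \sqrt{\frac{140}{20}} \;\approx\; 2.6 $$

from luminosity alone, with the larger Higgs production cross-sections at 13 TeV making up the rest.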

Several searches for additional Higgs bosons were presented. The general recipe here is to postulate a new field in addition to the Standard Model (SM) Higgs doublet, which on its own yields a lone physical Higgs boson, universally identified with the 125 GeV particle discovered at the LHC in 2012. Adding a hypothetical second Higgs doublet, as in the two-Higgs-doublet model, would yield five physical states: the CP-even neutral Higgs bosons h and H, the CP-odd pseudoscalar A, and two charged Higgs bosons H±; the model would also bequeath three additional free parameters. Other models discussed at Higgs Hunting 2019 include the minimal and next-to-minimal supersymmetric SMs, and extensions with doubly charged Higgs bosons. Anna Kaczmarska of ATLAS and Suzanne Gascon-Shotkin of CMS described direct searches for such additional Higgs bosons decaying to SM particles or Higgs bosons. Loan Truong of ATLAS and Yuri Gershtein of CMS described studies of rare – and potentially beyond-SM – decays of the 125 GeV Higgs boson. No significant excesses were reported, but hope remains for Run 3, which will begin in 2021.
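The counting of five states is a quick exercise in degrees of freedom:

$$ \underbrace{2\times 4}_{\text{two complex doublets}} \;-\; \underbrace{3}_{\text{Goldstones eaten by } W^{\pm},\,Z} \;=\; 5 \quad (h,\,H,\,A,\,H^{+},\,H^{-}). $$

Each complex SU(2) doublet carries four real degrees of freedom; electroweak symmetry breaking converts three of the eight into the longitudinal polarisations of the W± and Z, leaving five physical scalars, compared with the single Higgs boson (4 − 3 = 1) of the SM.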

Nobel laureate Gerard ’t Hooft gave a historical talk on the role of the Higgs in the renormalisation of electroweak theory, recalling the debt his Utrecht group, where the work was done almost 50 years ago, owed to pioneers like Faddeev and Popov. Seven years after the particle’s discovery, we now know it to be spin-0 with mainly CP-even interactions with bosons, remarked Fabio Cerutti of Berkeley in the experimental summary. With precision on the Higgs mass now better than two parts per mille, all of the SM’s free parameters are known with high precision, he continued, and all but three of them are linked to Higgs-boson interactions.

Give me six hours to chop down a tree and I will spend the first four sharpening the axe.

Abraham Lincoln

Hunting season may now be over, Cerutti concluded, but the time to study Higgs anatomy and exploit the 95% of LHC data still to come is close at hand. Giulia Zanderighi’s theory summary had a similar message: Higgs studies are still in their infancy and the discovery of what seems to be a very SM-like Higgs at 125 GeV allows us to explore a new sector with a broad experimental programme that will extend over decades. She concluded with a quote from Abraham Lincoln: “Give me six hours to chop down a tree and I will spend the first four sharpening the axe.”

The next Higgs Hunting workshop will be held in Orsay and/or Paris from 7–9 September 2020.

Supergravity pioneers share $3m Breakthrough Prize

Peter van Nieuwenhuizen, Sergio Ferrara and Dan Freedman (left to right) at CERN in 2016 on the occasion of supergravity’s 40th anniversary. Credit: S Bennett/CERN

Theorists Sergio Ferrara (CERN), Dan Freedman (MIT/Stanford) and Peter van Nieuwenhuizen (Stony Brook) have been awarded a Special Breakthrough Prize in Fundamental Physics for their 1976 invention of supergravity. Supergravity marries general relativity with supersymmetry and, after more than 40 years, continues to carve out new directions in the search for a unified theory of the basic interactions.

“This award comes as a complete surprise,” says Ferrara. “Supergravity is an amazing thing because it extends general relativity to a higher symmetry – the dream of Einstein – but none of us expected this.”

Supergravity followed shortly after the invention of supersymmetry. This new symmetry of space–time, which enables fermions to be “rotated” into bosons and vice versa, implies that each elementary particle has a heavier supersymmetric partner, and its arrival came at a pivotal moment for the field. The Standard Model (SM) of electroweak and strong interactions had just come into being, yet it was clear from the start that it was not complete: it is not truly unified, because the gluons of the strong force and the photons of electromagnetism do not emerge from a common symmetry, and it leaves out gravity, which is described by general relativity. Supersymmetry promised a way to tackle these and other problems with the SM.

It was clear that the next step was to extend supersymmetry to include gravity, says Ferrara, but it was not obvious how this could be done. During a short period from autumn 1975 to the following spring, Ferrara, Freedman and van Nieuwenhuizen succeeded – with the help of state-of-the-art computers – in producing a supersymmetric theory that included the gravitino as the supersymmetric partner of the graviton. The trio published their paper in June 1976. Edward Witten, chair of the prize selection committee, says of the achievement:

“The discovery of supergravity was the beginning of including quantum variables in describing the dynamics of space–time. It is quite striking that Einstein’s equations admit the generalisation that we know as supergravity.”

It is quite striking that Einstein’s equations admit the generalisation that we know as supergravity

Despite numerous searches at ever higher energies during the past decades, no supersymmetric particles have ever been observed. But the importance of supergravity and its influence on physics is already considerable – especially on string theory, of which supergravity is a low-energy manifestation. Supergravity was a crucial ingredient in the 1984 proof by Michael Green and John Schwarz that string theory is mathematically consistent, and it was also instrumental in the M-theory string unification by Edward Witten in 1995. It played a role in Andrew Strominger and Cumrun Vafa’s 1996 derivation of the Bekenstein–Hawking entropy for quantum black holes, and is also important in the holographic AdS/CFT duality discovered by Juan Maldacena in 1997.

“Supergravity led to great improvements in mathematical physics, especially supergroups and supermoduli, and in the growing field of string phenomenology, which attempts to include particle physics in superstring theory,” adds Ferrara.

Ferrara, Freedman and van Nieuwenhuizen have received several awards for the invention of supergravity, including the 1993 ICTP Dirac Medal and the 2006 Dannie Heineman Prize for Mathematical Physics. The Breakthrough Prize, founded in 2012 by Yuri Milner, a former theoretical particle physicist and the founder of DST Global, rewards achievements in fundamental physics, life sciences and mathematics. The $3m Special Breakthrough Prize can be awarded at any time “in recognition of an extraordinary scientific achievement”, and is not limited to recent discoveries. Previous winners of the Special Breakthrough Prize in Fundamental Physics are: Stephen Hawking; seven physicists whose leadership led to the discovery of the Higgs boson at CERN; the LIGO and Virgo collaborations for the detection of gravitational waves; and Jocelyn Bell Burnell for the discovery of pulsars.

The new laureates, along with the winners of the Breakthrough Prize in Life Sciences and Mathematics, will receive their awards at a ceremony at NASA’s “Hangar 1” on 3 November.

Austrian synchrotron debuts carbon-ion cancer treatment

The ion-beam injectors of the MedAustron facility in Austria. Credit: MedAustron/T Kästenbauer

MedAustron, an advanced hadron-therapy centre in Austria, has treated its first patient with carbon ions. The medical milestone, which took place on 2 July 2019, makes the particle-physics-linked facility one of just six centres worldwide able to combat tumours with both protons and carbon ions.

When protons and carbon ions strike biological material, they lose energy much more quickly than photons, which are traditionally used in radiotherapy. This makes it possible to deposit a large dose in a small and well-targeted volume, reducing damage to healthy tissue surrounding a tumour and thereby reducing the risk of side effects. While proton therapy has been successfully used at MedAustron since December 2016, treating more than 400 cancer patients so far, carbon-ion therapy opens up new opportunities to target tumours that were previously difficult or impossible to treat. Carbon ions are biologically more effective than protons and therefore allow a higher dose to be administered to the tumour.
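The physics behind this advantage can be stated compactly. For a heavy charged projectile, the Bethe stopping power scales as

$$ -\frac{dE}{dx} \;\propto\; \frac{z^{2}}{\beta^{2}}, $$

where z is the projectile charge and β its velocity: energy loss rises steeply as the particle slows, so most of the dose lands in a sharp “Bragg peak” near the end of the range, at a depth set by the beam energy, whereas photon dose falls off roughly exponentially with depth. The z² factor (z = 6 for carbon against 1 for protons) also underlies carbon’s denser ionisation, and hence its greater biological effectiveness.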

MedAustron’s accelerator complex is based on the CERN-led Proton Ion Medical Machine Study, the design of which was subsequently developed by CERN, the TERA Foundation, INFN in Italy and the CNAO Foundation (see “Therapeutic particles”). Substantial help was also provided by the Paul Scherrer Institute, in particular for the gantry and beam-delivery designs. The MedAustron system comprises an injector, where ions from three ion sources are pre-accelerated by a linear accelerator; a synchrotron; a high-energy beam-transport system to deliver the beam to various beam ports; and a medical front-end, which controls the irradiation process and covers all safety aspects with respect to the patient. Certified as a medical product, the accelerator provides proton and carbon-ion beams with a penetration depth of up to about 37 cm in water-equivalent tissue, and is able to deliver carbon ions at 255 different energies ranging from 120 to 400 MeV/u, with maximum intensities of up to 10⁹ ions per extracted beam pulse.

The MedAustron proton/carbon-ion synchrotron

“The first successful carbon-ion treatment unveils MedAustron’s full potential for cancer treatment,” says Michael Benedikt of CERN, who co-ordinated the laboratory’s contributions to the project. “The realisation of MedAustron, through the collaboration with CERN for the construction of the accelerator facility, is an excellent example of large-scale technology transfer from fundamental research to societal applications.”

Particle therapy with carbon ions was first used in Japan in 1994, and a total of almost 30,000 patients worldwide have since been treated with this method. Initially, treatment with carbon ions at MedAustron will focus on tumours in the head and neck region, and at the base of the skull. But the spectrum will be continuously expanded to include other tumour types. MedAustron is also working on the completion of an additional treatment room with a gantry that administers proton beams from a large variety of irradiation angles.

“Irradiation with carbon ions makes it possible to maintain both the physical functions and the quality of life of patients, even with very complicated tumours,” says Piero Fossati, scientific and clinical director of MedAustron’s carbon ion programme.
