Exploring quantum computing for high-energy physics

The ambitious upgrade programme for the Large Hadron Collider (LHC) will result in significant information and communications technology (ICT) challenges over the next decade and beyond. It is therefore vital that members of the HEP research community keep looking for innovative computing technologies so as to continue to maximise the discovery potential of the world-leading research infrastructures at their disposal (CERN Courier November 2018 p5).

On 5–6 November, CERN hosted a first-of-its-kind workshop on quantum computing in high-energy physics (HEP). The event was organised by CERN openlab, a public–private partnership between CERN and leading ICT companies established to accelerate the development of computing technologies needed by the LHC research community.

More than 400 people followed the workshop, which provided an overview of the current state of quantum-computing technologies. The event also served as a forum to discuss which activities within the HEP community may be amenable to the application of quantum-computing technologies.

“In CERN openlab, we’re always looking with keen interest at new computing architectures and trying to understand their potential for disrupting and improving the way we do things,” says Alberto Di Meglio, head of CERN openlab. “We want to understand which computing workflows from HEP could potentially most benefit from nascent quantum-computing technologies; this workshop was the start of the discussion.”

Significant developments are being made in the field of quantum computing, even if today’s quantum-computing hardware has not yet reached the level at which it could be put into production. Nevertheless, quantum-computing technologies are among those that hold future promise of substantially speeding up tasks that are computationally expensive.

“Quantum computing is no panacea, and will certainly not solve all the future computing needs of the HEP community,” says Eckhard Elsen, CERN’s director for research and computing. “Nevertheless, quantum computers are starting to be available; a breakthrough in the number of qubits could emerge at any time. Fundamentally rethinking our algorithms may appear as an interesting intellectual challenge today, yet may turn out as a major benefit in addressing computing challenges in the future.”

The workshop featured representatives of the LHC experiments, who spoke about how computing challenges are likely to evolve as we approach the era of the High-Luminosity LHC. There was also discussion of work already undertaken to assess the feasibility of applying today’s quantum-computing technologies to problems in HEP. Jean-Roch Vlimant provided an overview of recent work at the California Institute of Technology, carried out with collaborators from the University of Southern California, to solve an optimisation problem related to the search for Higgs bosons. Using an approach known as quantum annealing for machine learning, the team demonstrated some advantage over traditional machine-learning methods for small training datasets. Given the relative simplicity of the algorithm and its robustness to error, they report, this technique may find application in other areas of experimental particle physics, such as real-time decision making in event-selection problems and classification in neutrino physics.
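For readers curious what “quantum annealing for machine learning” looks like in practice, the sketch below illustrates only the general idea: a strong classifier is assembled from binary-weighted weak classifiers, with the weights chosen by minimising an Ising-like cost function of the kind an annealer (quantum or classical) would optimise. The toy data, cost function, regularisation term and brute-force minimiser are illustrative assumptions, not the Caltech team’s actual formulation or code.

import itertools
import numpy as np

# Toy setup: each of 200 simulated events is scored by 6 "weak classifiers"
# (think simple kinematic cuts), with labels y = +1 (signal) or -1 (background).
rng = np.random.default_rng(0)
n_samples, n_weak = 200, 6
y = rng.choice([-1, 1], size=n_samples)
c = np.sign(y[:, None] * rng.normal(0.3, 1.0, size=(n_samples, n_weak)))  # weak outputs in {-1, +1}

# Ising/QUBO-style cost: choose binary weights w_i in {0, 1} so that the ensemble
# sum_i w_i c_i(x) tracks the labels while penalising redundant classifiers.
lam = 0.1                      # regularisation strength (illustrative choice)
Cij = c.T @ c / n_samples      # weak-classifier correlation matrix
Ci = c.T @ y / n_samples       # correlation of each weak classifier with the label

def cost(w):
    w = np.asarray(w)
    return w @ Cij @ w - 2 * Ci @ w + lam * w.sum()

# A quantum annealer would minimise this cost over the binary weights; for a
# handful of weak classifiers, a brute-force scan over all 2^6 choices suffices.
best_w = min(itertools.product([0, 1], repeat=n_weak), key=cost)
strong = np.sign(c @ np.asarray(best_w) + 1e-9)
print("selected weights:", best_w, "training accuracy:", float(np.mean(strong == y)))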

Several large-scale research initiatives related to quantum-computing technologies were presented at the event, including the European Union’s €1 billion Quantum Technologies Flagship project, which involves universities and commercial partners across Europe. Presentations were also given of ambitious programmes in the US, such as the Northeast Quantum Systems Center at Brookhaven National Laboratory and the Quantum Science Program at Fermilab, which includes research areas in superconducting quantum systems, quantum algorithms for HEP, and computational problems and theory.

Perhaps most importantly, the workshop brought members of the HEP community together with leading companies working on quantum-computing technologies. Intel, IBM, Strangeworks, D-Wave, Microsoft, Rigetti and Google all presented their latest work in this area at the event. Of these companies, Intel and IBM are already working closely with CERN through CERN openlab. Google also announced at the event that it has signed an agreement to join CERN openlab.

“Now is the right time for the HEP community to get involved and engage with different quantum-computing initiatives already underway, fostering common activities and knowledge sharing,” says Federico Carminati, CERN openlab CIO and chair of the event. “With its well-established links across many of the world’s leading ICT companies, CERN openlab is ideally positioned to help drive this activity forward. We believe this first event was a great success and look forward to organising future activities in this exciting area.”

Recordings of the talks given at the workshop are available via the CERN openlab website at: openlab.cern.

Theory event fuses physics and gender

CERN hosted a workshop on high-energy theory and gender on 26–28 September. It was the first activity of the “GenHET” working group, whose goals are to improve the presence and visibility of women in the field of high-energy theory and increase awareness of gender issues.

Most of the talks in the workshop were on physics. Invited talks spanned the whole of high-energy theory, providing an opportunity for participants to learn about new results in neighbouring research areas at this interesting time for the field. Topics ranged from the anti-de-Sitter/conformal field theory (AdS/CFT) correspondence and inflationary cosmology to heavy-ion, neutrino and beyond-Standard Model physics.

Agnese Bissi (Uppsala University, Sweden) began the physics programme by reviewing the now-two-decades-old AdS/CFT correspondence, and discussing the use of conformal bootstrap methods in holography. Korinna Zapp (LIP, Lisbon, Portugal and CERN) then put three recent discoveries in heavy-ion physics into perspective: the hydrodynamic behaviour of soft particles; jet quenching; and surprising similarities between soft particle production in high-multiplicity proton–proton and heavy-ion collisions.

JiJi Fan (Brown University, USA) delved into the myriad world of beyond-the-Standard-Model phenomenology, discussing the possibility that the Higgs is “meso-tuned” but that there are no other light scalars. Elvira Gamiz (University of Granada, Spain) reviewed key features of lattice simulations for flavour physics and mentioned significant tensions with some experimental results that are as high as 3σ in certain B-decay channels. The theory colloquium, by Ana Achucarro (University of Leiden, the Netherlands, and UPV-EHU Bilbao, Spain), was devoted to the topic of inflation, which still presents a major challenge to theorists.

The importance of parton distribution functions in an era of high-precision physics was the focus of a talk by Maria Ubiali (University of Cambridge, UK), who explained the state-of-the-art methods used. Reviewing key topics in cosmology and particle physics, Laura Covi (Georg-August-University Göttingen, Germany) then described how models with heavy R-parity violating supersymmetry lead to scenarios for baryogenesis and gravitino dark matter.

In neutrino physics, Silvia Pascoli (Durham University, UK) gave an authoritative overview of the experimental and theoretical status, while Tracy Slatyer (MIT, USA) did the same for dark matter, emphasising the necessity of search strategies that test many possible dark-matter models.

Closing the event, Alejandra Castro (University of Amsterdam, the Netherlands) talked about black-hole entropy and its fascinating connections with holography and number theory. The final physics talk, by Eleni Vryonidou (CERN), covered Standard Model effective field theory (SMEFT), which provides a pathway to new physics above the direct energy reach of colliders.

The rest of the workshop centred on talks and discussion sessions about gender issues. The full spectrum of issues was addressed, a few examples of which are given here.

Julie Moote from University College London, UK, delivered a talk on behalf of the Aspires project in the UK, which is exploring how social identities and inequalities affect students continuing in science, while Marieke van den Brink from Radboud University Nijmegen, the Netherlands, described systematic biases that were uncovered by her group’s studies of around 1000 professorial appointments in the Netherlands. Meytal Eran-Jona from the Weizmann Institute of Science, Israel, reviewed studies about unconscious bias and its implications for women in academia, and described avenues to promote gender equality in the field.

The last day of the meeting focused on actions that physicists can take to improve diversity in their own departments. For example, Jess Wade from Imperial College London, UK, discussed UK initiatives such as the Institute of Physics Juno and Athena SWAN awards, and Yossi Nir from the Weizmann Institute gave an inspiring account of his work on increasing female participation in physics in Israel. One presentation drawing on bibliometric data in high-energy theory attracted much attention beyond the workshop, as has been widely reported elsewhere.

This first workshop on high-energy theory and gender combined great physics, mentoring and networking. The additional focus on gender gave participants the opportunity to learn about the sociological causes of gender imbalance and how universities and research institutes are addressing them.

We are very grateful to many colleagues for their support in putting together this meeting, which received help from the CERN diversity office and financial support from the CERN theory department, the Mainz “cluster of excellence” PRISMA, Italy’s National Institute for Nuclear Physics (INFN), the University of Milano-Bicocca, the ERC and the COST network.

Similar activities are planned in the future, including discussions on other scientific communities and minority groups.

Karlheinz Meier 1955–2018

Karlheinz Meier, a visionary experimental particle physicist and co-founder of the Human Brain Project, unexpectedly passed away on 24 October, much too early, at the age of 63.

Karlheinz’s career began at the University of Hamburg in Germany, where he studied physics. He completed his PhD there in 1984, with Gus Weber and Wulfrin Bartel as his supervisors, working for the JADE experiment at the PETRA electron–positron collider at DESY. During the following six years, he worked at CERN for the UA2 project, for two years as a CERN fellow and then as a staff scientist. Returning to DESY in 1990, he joined the H1 collaboration. In 1992 he accepted a full professorship at Heidelberg University, where in 1994 he founded the Heidelberg ASIC Laboratory for Microelectronics and later, in 1999, the Kirchhoff Institute for Physics; during this period he also joined the ATLAS collaboration at CERN’s Large Hadron Collider (LHC). He was vice-rector at Heidelberg University from 2001 to 2004, chair of the European Committee for Future Accelerators (ECFA) from 2007 to 2009, and a member of the governing board of the German Physical Society (DPG) from 2009 to 2013. Within the Human Brain Project – a major 10-year effort harnessing cutting-edge research infrastructure for the benefit of neuroscience, computing and brain-related medicine – he initiated the European Institute for Neuromorphic Computing (EINC) at Heidelberg. Sadly, he will not see the facility completed.

Karlheinz was an extremely enthusiastic, visionary and energetic scientist. He made fundamental contributions to the instrumentation and data analysis of large particle-physics experiments, especially concerning calorimeter systems. Early on, during his PhD, he developed advanced algorithms for identifying photons with the lead-glass calorimeter of JADE, an essential ingredient for his analysis of the inclusive production of photons, pions and η-mesons in multi-hadronic final states, but also for many studies of hadronisation and jet production, which JADE became famous for. Later, at CERN’s UA2 experiment, he participated in the first analyses of the newly discovered W and Z bosons.

Back at DESY, he was one of the advocates and initiators of the H1 scintillating fibre “spaghetti” calorimeter, which was decisive for precise measurements of the proton structure. In addition, his research group built another specialised backward calorimeter for H1, the VLQ; by analysing the VLQ data, he was able to refute the theoretical predictions of the time on special multi-gluon states, such as the odderon (CERN Courier April 2018 p9). Karlheinz recognised early on the need for developing highly integrated electronic circuits for experimental physics, and his group – together with colleagues from the Heidelberg ASIC Laboratory – developed the pre-processor system of the ATLAS level-1 calorimeter trigger, which played a pivotal role in the discovery of the Higgs boson.

From 2001, Karlheinz became increasingly interested in fundamental questions related to the physics of complex systems and information processing, with a focus on the development of neuromorphic hardware for decoding the functioning of the brain. In contrast to normal, programme-oriented Turing machines, neuromorphic systems are extremely energy efficient, error tolerant and self-adaptive – just like the human brain. His research results received special international recognition through the Human Brain Project, which he initiated together with Henry Markram and Richard Frackowiak, and which was selected by the European Union in 2012 as one of two so-called Flagship Projects of European research funding.

Karlheinz was also exceptional in supervising and motivating young researchers. He was a highly gifted teacher, whose lectures and seminars were loved by his students. Through his renowned “Team-Anderthalb” 90-second movies on a wide variety of basic physics topics, he became known to the wider public; they are available, like many other of his lectures and talks, on YouTube.

Curiosity for the fundamental questions of physics and technological innovation were the two driving forces that accompanied Karlheinz throughout his research life. He not only contributed significantly to the expansion of our knowledge about nature, but also gave new impetus to technological development, especially in the field of microelectronics and computing. His commitment to both research and teaching was outstanding and special. His passion, humanity, humour, overall guidance and inspiration will be sorely missed and not forgotten.

Inside Story: On the Courier’s new future

“I think the Courier is excellent; it’s sort of ‘frozen in time’, but in a rather appropriate and appealing way.” Of all the lively comments received from the 1400 or so readers who took part in our recent survey (see below), this one sums things up for the CERN Courier. “Excellent” might be a stretch for some, but, coming up for its 60th anniversary, this well-regarded periodical is certainly unique. It has been alongside high-energy physics as the field has grown up, from the rise of the Standard Model to the strengthening links with cosmology and astrophysics, the increasing scale and complexity of accelerators, detectors and computing, the move to international collaborations involving thousands of people, and other seismic shifts.

In terms of presentation, though, the Courier is indeed ripe for change. The website preview-courier.web.cern.ch was created in 1998 when the magazine’s production and commercial dimensions were outsourced to IOP Publishing in the UK. Updated only 10 times per year with the publication of each print issue, the website has had a couple of makeovers (one in 2007 and one earlier this year) but its functionality has remained essentially unchanged for 20 years.

A semi-static, print-led website is no longer best suited to today’s publishing scene. The sheer flexibility of online publishing allows more efficient ways to communicate different stories to new audiences. Our survey concurs: a majority of readers (63%) indicated that they were willing to receive fewer print copies per year if preview-courier.web.cern.ch was updated more regularly – a view held most strongly among younger responders. It is this change to its online presence that drives the new publishing model of CERN Courier from 2019, with a new, dynamic website planned to launch in the spring.

At the same time, there is high value attached to a well-produced print magazine that worldwide readers can receive free of charge. And, as the results of our survey show, a large section of the community reads the Courier only when they pick up a copy in their labs or universities to browse over lunch or while travelling. That’s why the print magazine is staying, though at a reduced frequency of six rather than 10 issues per year. To reflect this change, the magazine will have a new look from next year. Among many improvements, we have adopted a more readable font, a clearer layout and other modern design features. There are new and revised sections covering careers, opinion and reviews, while the feature articles – the most popular according to our survey – will remain as the backbone of the issue.

It is sometimes said that the Courier can be a bit too formal, a little dry. Yet our survey did not reveal a huge demand to lighten things up – so don’t expect to see Sudoku puzzles or photos of your favourite pet any time soon. That said, the Courier is a magazine, not an academic journal; in chronicling progress in global high-energy physics it strives to be as enjoyable as it is authoritative.

Another occasional criticism is that the Courier is a mere mouthpiece for CERN. If it is, then it is also – and unashamedly – a mouthpiece for other labs and for the field as a whole. Within just a few issues of its publication, the Courier outgrew its original editorial remit and expanded to cover activities at related laboratories worldwide (with the editorially distinct CERN Bulletin serving the internal CERN community). The new-look Courier will also retain an important sentence on its masthead demarcating the views stated in the magazine from those of CERN management.

A network of around 30 laboratory correspondents helps to keep the magazine updated with news from their facilities on an informal basis. But the more members of the global high-energy physics community who interact, the better the Courier can serve them. Whether it’s a new result, experiment, machine or theorem, an event, appointment or prize, an opinion, review or brazen self-promotion, get in touch at cern.courier@cern.ch.

Reader survey: the results are in

To shape the Courier’s new life in print and online, a survey was launched this summer in conjunction with IOP Publishing to find out what readers think of the magazine and website, and what changes could be made. The online survey asked 21 questions and responders were routed to different sections of the survey depending on the answers they provided. Following promotion on preview-courier.web.cern.ch, CERN’s website, CERN Bulletin, social media channels and e-mails to CERN users, there were a total of 1417 responses.

Chart showing age of survey responders

Responders were split roughly 3:1 male to female, with a fairly even age distribution. Geographically, they were based predominantly in France, the US, Italy, Switzerland, Germany and the UK. Some 43% of the respondents work at a university, followed by a national or international research institute (34%), with the rest working in teaching (5%) and various industries. While three-quarters of the respondents named experimental particle physics as their main domain of work, many have other professional interests ranging from astronomy to marketing.

Responders were evenly split between those who read the printed magazine and those who don’t. Readers tend to read the magazine on a regular basis and, overall, have been reading for a significant period of time. A majority (54.1%) do not read the magazine via a direct subscription, and the data suggest that one copy of the Courier is typically read by more than one person.

Graph showing professional positions

In terms of improving the CERN Courier website, there was demand for a mobile-optimised platform and for video content, though a number of respondents were unaware that the website even existed. Importantly for the future of CERN Courier, a majority of readers (63%) indicated that they were willing to receive fewer print copies per year if preview-courier.web.cern.ch was updated more regularly; this trend was sharpest in the under-30 age group.

When it comes to the technical level of the articles, which is a topic of much consideration at the Courier, the responses indicate that the level is pitched just right (though, clearly, a number of readers will find some topics tougher than others given the range of subfields within high-energy physics). Readers also felt that their fields were well represented, and agreed that more articles about careers and people would be of interest.

Graph showing professional interests

Many written comments were provided, a few of which are listed here: “More investigative articles please”; “I would like that it has a little glossary”; “A column about people themselves, not only the physics they do”; “More debate on topics on which there is discussion in the field”; “Please do NOT modify CERN Courier into a ‘posher’ version”; “Leave out group photos of people at big meetings”; and “Make a CERN Courier kids edition”. The overwhelming majority of comments were positive, and the few that weren’t stood out: “The whole magazine reads like propaganda for CERN and for the Standard Model”; “The Courier style is intentionally humourless, frigid, stale and boring. Accordingly, almost everybody agrees that the obituaries are by far its best part”; and, curiously, “The actual format is so boring that I stop to read it!”

It only remains to thank participants of the survey and to congratulate the winners of our random prize draw (V Boudry, J Baeza, S Clawson, V Lardans and M Calvetti), who each receive a branded CERN hoodie.

LHCb constrains ultra-rare muonic B decay

A report from the LHCb experiment

Measurements of b-hadron decays with neutrinos in the final state are one of the best ways to understand how quarks decay, and in particular how they couple to leptons. With recent results from LHCb, BaBar and Belle raising questions about whether the Standard Model (with its assumption of lepton-flavour universality) is able to explain these couplings fully, further experimental results are needed.

However, studying these decays is notoriously tricky at a hadron collider, where the busy collision environment makes it challenging to control the background. Despite this, the LHCb collaboration has made unexpected progress in this area over the last few years, with a comparison of decays with taus and muons, and measurements of the CKM element ratio |Vub/Vcb| that originally seemed impossible.

At first glance, measuring fully leptonic decays such as B+→ τ+ντ and B+→ μ+νμ seems a step too far, since there is only one charged particle as a signature and no reconstructed B-decay vertex. The key to accessing these processes is to allow additional particles to be radiated, while preserving the underlying decay amplitude. The decay B+→ μ+μ–μ+νμ is a good example of this, where a hard photon is radiated and converts immediately into two additional muons. Such a signature is significantly more appealing experimentally: there is a vertex to reconstruct and the background is low, as there are not many B decays that produce three muons.

B decays with a well-defined vertex and only one missing neutrino are becoming LHCb’s “bread and butter” thanks to the so-called corrected mass technique. The idea behind the corrected mass is that if only one neutrino is missing, then combining the visible mass with the visible momentum perpendicular to the B flight direction is enough to recover the B mass. This technique is only possible thanks to the precise vertex resolution provided by LHCb’s innermost detector, the VELO. Using this technique, LHCb expects to have a very good sensitivity for this decay, at a branching fraction level of 2.8 × 10−8 (equivalent to around one in 40 million B+ decays) with the 2011–2016 data sample.
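Concretely, the corrected mass is defined as m_corr = √(m² + p⊥²) + p⊥, where m is the invariant mass of the visible decay products and p⊥ is their momentum component transverse to the B flight direction – which, by momentum balance, equals the transverse momentum carried off by the missing neutrino. The short Python sketch below is purely illustrative and is not LHCb analysis code; the function name and toy numbers are invented.

import numpy as np

def corrected_mass(p_vis, m_vis, flight_dir):
    # Corrected mass m_corr = sqrt(m_vis^2 + p_perp^2) + p_perp, where p_perp is
    # the visible momentum component transverse to the B flight direction.
    flight_dir = np.asarray(flight_dir, dtype=float)
    flight_dir /= np.linalg.norm(flight_dir)             # unit vector: primary vertex -> decay vertex
    p_vis = np.asarray(p_vis, dtype=float)
    p_par = np.dot(p_vis, flight_dir)                    # component along the flight direction
    p_perp = np.linalg.norm(p_vis - p_par * flight_dir)  # component transverse to it
    return np.sqrt(m_vis**2 + p_perp**2) + p_perp

# Toy example: momentum (GeV) of a three-muon system and a reconstructed flight direction
print(corrected_mass(p_vis=[1.2, -0.8, 45.0], m_vis=3.1, flight_dir=[0.02, -0.015, 1.0]))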

The LHCb collaboration searched for this decay using 5 fb–1 of data (see figure). The main backgrounds come from reconstructed muons that originate from different decays (“combinatorial”) or from hadrons misidentified as muons (“misidentified”). No evidence for the signal is seen and an upper limit on the branching fraction of 1.6 × 10−8 is set at a confidence level of 95%.

The figure also shows a projected signal expected from a recent Standard Model prediction, which is based on the vector meson dominance model. This prediction includes two contributions to the decay: one in which two muons originate from a photon, and another in which they originate from the annihilation of a hadron (such as ρ0→ μ+μ– or ω→ μ+μ–). As can be seen, the data disfavour this prediction, which motivates further theoretical work to understand the discrepancy. The good sensitivity for this decay is encouraging, and raises interesting prospects for observing the signal with future datasets collected at the upgraded LHCb detector.

The deepest clean lab in the world

Deep in a mine in Greater Sudbury, Ontario, Canada, you will find the deepest flush toilets in the world. Four of them, actually, ensuring the comfort of the staff and users of SNOLAB, an underground clean lab with very low levels of background radiation that specialises in neutrino and dark-matter physics.

Toilets might not be the first thing that comes to mind when discussing a particle-physics laboratory, but they are one of numerous logistical considerations when hosting 60 people per day at a depth of 2 km for 10 hours at a time. SNOLAB is the world’s deepest cleanroom facility, a class-2000 cleanroom (see panel below) the size of a shopping mall situated in the operational Vale Creighton nickel mine. It is an expansion of the facility that hosted the Sudbury Neutrino Observatory (SNO), a large, heavy-water detector designed to detect neutrinos from the Sun. In 2001, SNO contributed to the discovery of neutrino oscillations, leading to the joint award of the 2015 Nobel Prize in Physics to SNO spokesperson Arthur B McDonald and Super-Kamiokande spokesperson Takaaki Kajita.

Initially, there were no plans to maintain the infrastructure beyond the timeline of SNO, which was just one experiment and not a designated research facility. However, following the success of the SNO experiment, there was increased interest in low-background detectors for neutrino and dark-matter studies.

Building on SNO’s success

The SNO collaboration was first formed in 1984, with the goal of solving the solar neutrino problem. This problem surfaced during the 1960s, when the Homestake experiment in the Homestake Mine at Lead, South Dakota, began looking for neutrinos created in the early stages of solar fusion. This experiment and its successors, using different target materials and technologies, consistently observed only 30–50% of the neutrinos predicted by the standard solar model. A seemingly small nuisance posed a large problem, which required a large-scale solution.

SNO used a 12 m-diameter spherical vessel containing 1000 tonnes of heavy water to count solar neutrino interactions. Canada had vast reserves of heavy water for use in its nuclear reactors, making it an ideal location for such a detector. The experiment also required an extreme level of cleanliness, so that the signals physicists were searching for would not be confused with background events coming from dust, for instance. The SNO collaboration also had to develop new techniques to measure the inherent radioactivity of their detector materials and the heavy water itself.

Using heavy water gave SNO the ability to observe three different neutrino reactions: one reaction could only happen with electron neutrinos; one was sensitive to all neutrino flavours (electron, muon and tau); and the third provided the directionality pointing back to the Sun. These three complementary interactions let the team test the hypothesis that solar neutrinos were changing flavour as they travelled to Earth. In contrast to previous experiments, this approach allowed SNO to make a measurement of the parameters describing neutrino oscillations that didn’t depend on solar models. SNO’s data confirmed what previous experiments had seen and also verified theoretical predictions, implying that neutrinos do indeed oscillate during their Sun–Earth journey. The experiment ran for seven years and produced 178 papers involving more than 275 authors.

In 2002, the Canadian community secured funding to create an extended underground laboratory with SNO as the starting point. Construction of SNOLAB’s underground facility was completed in 2009 and two years later the last experimental hall entered “cleanroom” operation. Some 30 letters of interest were received from different collaborations proposing potential experiments, helping to define the requirements of the new lab.

SNOLAB’s construction was made possible by capital funds totalling CAD$73 million, with more than half coming from the Canada Foundation for Innovation through the International Joint Venture programme. Instead of a single giant cavern, local company Redpath Mining excavated several small and two large halls to hold experiments. The smaller halls helped the engineers manage the enormous stress placed on the rock in larger underground cavities. Bolts 10 m long stabilise the rock in the ceilings of the remaining large caverns, and throughout the lab the rock is covered with a 10 cm-thick layer of spray-on concrete for further stability, with an additional hand-troweled layer to help keep the walls dust-free. This latter task was carried out by Béton Projeté MAH, the same company that finished the bobsled track in the 2010 Vancouver Winter Olympics.

In addition to the experimental halls, SNOLAB is equipped with a chemistry laboratory, a machine shop, storage areas, and a lunchroom. Since the SNO experiment was still running when new tunnels and caverns were excavated, the connection between the new space and the original clean lab area was completed late in the project. The dark-matter experiments DEAP-1 and PICASSO were also already running in the SNO areas before construction of SNOLAB was completed.

Dark matter, neutrinos, and more

Today, SNOLAB employs a staff of over 100 people, working on engineering design, construction, installation, technical support and operations. In addition to providing expert and local support to the experiments, SNOLAB research scientists undertake research in their own right as members of the collaborations.

With so much additional space, SNOLAB’s physics programme has expanded greatly during the past seven years. SNO has evolved into SNO+, in which a liquid scintillator replaces the heavy water to increase the detector’s sensitivity. The scintillator will be doped with tellurium, making SNO+ sensitive to the hypothetical process of neutrinoless double-beta decay. Two of tellurium’s natural isotopes (128Te and 130Te) are known to undergo conventional double-beta decay, making them good candidates to search for the long-sought neutrinoless version. Detecting this decay would demonstrate that lepton number is not conserved, proving that the neutrino is its own antiparticle (a Majorana particle). SNO+ is one of several experiments currently hunting this process down.

Another active SNOLAB experiment is the Helium and Lead Observatory (HALO), which uses 76 tons of lead blocks instrumented with 128 helium-3 neutron detectors to capture the intense neutrino flux generated when the core of a star collapses at the early stages of a supernova. Together with similar detectors around the world, HALO is part of a supernova early-warning system, which allows astronomers to orient their instruments to observe the phenomenon before it is visible in the sky.

With no fewer than six active projects, dark-matter searches comprise a large fraction of SNOLAB’s physics programme. Many different technologies are employed to search for the dark-matter candidate of choice: the weakly interacting massive particle (WIMP). The PICASSO and COUPP collaborations were both using bubble chambers to search for WIMPs, and merged into the very successful PICO project. Through successive improvements, PICO has endeavoured to enhance the sensitivity to WIMP spin-dependent interactions by an order of magnitude every couple of years. Its sensitivity is best for WIMP masses around 20 GeV/c2. Currently the PICO collaboration is developing a much larger version with up to 500 litres of active-mass material.

DEAP-3600, successor to DEAP-1, is one of the biggest dark-matter detectors ever built, and it has been taking data for almost two years now. It seeks to detect spin-independent interactions between WIMPs and 3300 kg of liquid argon contained in a 1.7 m-diameter acrylic vessel. The best sensitivity will be achieved for a WIMP mass of 100 GeV/c2. Using a different technology, the DAMIC (Dark Matter In CCDs) experiment employs CCD sensors, which have low intrinsic noise levels, and is sensitive to WIMP masses as low as 1 GeV/c2.

Although the science at SNOLAB primarily focuses on neutrinos and dark matter, the low-background underground environment is also useful for biology experiments. REPAIR explores how low radiation levels affect cell development and repair from DNA damage. One hypothesis is that removing background radiation may be detrimental to living systems. REPAIR can help determine whether this hypothesis is correct and characterise any negative impacts. Another experiment, FLAME, studies the effect of prolonged time spent underground on living organisms using fruit flies as a model. The findings from this research could be used by mining companies to support a healthier workforce.

Future research

There are many exciting new experiments under construction at SNOLAB, including several dark-matter experiments. While the PICO experiment is increasing its detector mass, other experiments are using several different technologies to cover a wide range of possible WIMP masses. The SuperCDMS experiment and CUTE test facility use solid-state silicon and germanium detectors kept at temperatures near absolute zero to search for dark matter, while the NEWS-G experiment will use gases such as hydrogen, helium and neon in a 1.4 m-diameter copper sphere.

SNOLAB still has space available for additional experiments requiring a deep underground cleanroom environment. The Cryopit, the largest remaining cavern, will be used for a next-generation double-beta-decay experiment. Additional spaces outside the large experimental halls can host several small-scale experiments. While the results of today’s experiments will influence future detectors and detector technologies, the astroparticle physics community will continue to demand clean underground facilities to host the world’s most sensitive detectors. From an underground cavern carved out to host a novel neutrino detector to the deepest cleanroom facility in the world, SNOLAB will continue to seek out and host world-class physics experiments to unravel some of the universe’s deepest mysteries.

Exploring how antimatter falls

Two new experiments at CERN, ALPHA-g and GBAR, have begun campaigns to check whether antimatter falls under gravity at the same rate as matter.

The gravitational behaviour of antimatter has never been directly probed, though indirect measurements have set limits on the deviation from standard gravity at the level of 10–6 (CERN Courier January/February 2017 p39). Detecting even a slight difference between the behaviour of antimatter and matter with respect to gravity would mean that Einstein’s equivalence principle is not perfect and could have major implications for a quantum theory of gravity.

ALPHA-g, closely modelled on the ALPHA experiment, combines antiprotons from CERN’s Antiproton Decelerator (AD) with positrons from a sodium-22 source and traps the resulting antihydrogen atoms in a vertical magnetic trap about 2 m tall. To measure their free fall, the field is switched off so that the atoms fall under gravity, and the position where the antiatoms annihilate with normal matter allows their rate of fall to be determined precisely.

GBAR adopts a similar approach but takes antiprotons from the new and lower-energy ELENA ring attached to the AD (CERN Courier December 2016 p16) and combines them with positrons from a small linear accelerator to make antihydrogen ions. Once a laser has stripped all but one positron, the neutral antiatoms will be released from the trap and allowed to fall from a height of 20 cm.

ALPHA-g began taking beam on 30 October, while ELENA has been delivering beam to GBAR since the summer, allowing the collaboration to perfect the beam-delivery system. Both experiments are being commissioned before CERN’s accelerators are shut down on 10 December for a two-year period. The ALPHA-g team hopes to be able to gather enough data during this short period to make a first measurement of antihydrogen in free fall, while the brand new GBAR experiment aims to make a first measurement when antiprotons are back in the machine in 2021. A third experiment at the AD hall, AEgIS, which has been in operation for several years, is also measuring the effect of gravity on antihydrogen using yet another approach, based on a beam of antihydrogen atoms. AEgIS is also hoping to produce its first antihydrogen atoms this year.

So far, most efforts at the AD have focused on looking for charge–parity–time violation by studying the spectroscopy of antihydrogen and comparing it with that of hydrogen (CERN Courier March 2018 p30). This latest round of experiments opens a new avenue in antimatter exploration.

The tale of a billion-trillion protons

Before being smashed into matter at high energies to study nature’s basic laws, protons at CERN begin their journey rather uneventfully, in a bottle of hydrogen gas. The protons are separated from their electrons by injecting the gas into the cylinder of an ion source and striking an electrical discharge, after which they enter what has become the workhorse of CERN’s proton production for the past 40 years: a 36 m-long linear accelerator called Linac2. Here, the protons are accelerated to an energy of 50 MeV, reaching approximately one-third of the speed of light, ready to be injected into the first of CERN’s circular machines: the Proton Synchrotron Booster (PSB), followed by the Proton Synchrotron (PS) and the Super Proton Synchrotron (SPS). At each stage of the chain, they may end up driving fixed-target experiments, generating exotic beams in the ISOLDE facility, or being injected into the Large Hadron Collider (LHC) to be accelerated to the highest energies.

Situated at ground level on the main CERN site, Linac2 has delivered all of the protons for the CERN accelerator complex since 1978. Construction of Linac2 started in December 1973, and the first 50 MeV beam was obtained on 6 September 1978. Within a month, the design current of 150 mA was reached and the first injection tests in the PSB started. Routine operation of the PSB started soon afterwards, in December 1978. As proudly announced by CERN at the time, Linac2 was completed on budget and on schedule, for an overall cost of 23 million Swiss francs.

Linac2 is the machine that started more than a billion-trillion protons on trajectories that led to discoveries including the W and Z bosons, the creation of antihydrogen and the completion of the long search for the Higgs boson. On 12 November, Linac2 was switched off and will now be decommissioned as part of a major upgrade to the laboratory’s accelerator complex (CERN Courier October 2017 p32). Its design, operation and performance have been key factors in the success of CERN’s scientific programme and paved the way to its successor, Linac4, which will take over the task of producing CERN’s protons from 2020.

The decision to build Linac2 was taken in October 1973, with the aim of providing a higher-intensity proton beam than the existing Linac1 machine could deliver. Linac1 had been the original injector both to the PS when it began service in 1959, and to its booster (the PSB) when it was added to the chain in 1972. However, Linac1 was limited in the intensity it could provide, and the only route to higher intensity was an entirely new machine.

Forward thinking

Linac2’s design parameters were chosen to comfortably exceed the nominal PSB requirements, providing a safety margin during operation and for future upgrades. Furthermore, it was decided to install the linac in a new building parallel to the Linac1 location instead of in the Linac1 tunnel. This avoided a long shut-down for installation and commissioning, and ensured that Linac1 was available as a back-up during the first years of Linac2 operation.

Linac2’s proton source was originally a huge 750 kV Cockcroft–Walton generator located in a shielded room, separate from the accelerator hall (figure 1), which provided the pre-acceleration to the entrance of the 4 m-long low energy beam transport line (LEBT). This transport line included a bunching system made of three RF cavities, after which protons were fed to the main accelerator: a drift-tube linac (DTL) that had many improvements with respect to the Linac1 design and became a standard for linacs at the time. The three accelerating RF “tanks”, increasing the beam energy up to 10.3, 30.5 and 50 MeV, respectively, with a total length of 33.3 m, were made of mild steel co-laminated with a copper sheet, with the vacuum and RF sealing provided by aluminium wire joints.

The RF system is of prime importance for the performance of linear accelerators. For Linac2, the amplifiers had to provide a total RF power of 7.5 MW just to accelerate the beam. The RF amplifiers were based on the Linac1 design principles, with larger diameters in order to safely deliver the higher power, and the RF tube was the same triode already used for most of the Linac1 amplifiers.

The most significant upgrade to Linac2, which took place during the 1992/1993 shutdown, was the replacement of the 750 kV Cockcroft–Walton generator and of the LEBT with a new RF quadrupole (RFQ) only 1.8 m long, capable of bunching, focusing and accelerating the beam in the same RF structure. The RFQ was a new invention of the early 1980s that was immediately adopted at CERN: after the successful construction of a prototype RFQ for Linac1 (which at the time was still in service), the development of a record-breaking high-intensity RFQ for Linac2, capable of delivering to the DTL a current of 200 mA, started in 1984. The prototype high-current RFQ was commissioned on a test stand in 1989, and the replacement of the Linac2 pre-injector was officially approved in 1990.

Gearing up for the LHC

The main motivation for the higher current of Linac2 was to prepare the CERN injectors for the LHC, which was already in progress. It was clear that the LHC would require unprecedented beam brightness (intensity per emittance) from the injector chain, and one of the options considered was to go to single-turn injection into the PSB of a high-current linac beam to minimise emittance growth. This, in turn, required the highest achievable current from the linac. Another motivation for the replacement was the simpler operation and maintenance of the smaller RFQ compared with the large Cockcroft–Walton installation.

Construction of the new RFQ (figure 2) started soon after approval, and the new “RFQ2” system was installed at Linac2 during the normal shut-down in 1992/1993. Commissioning of the RFQ2 with Linac2 took a few weeks, and the 1993 physics run started with the new injector. Reaching the full design performance of the RFQ took a few years, mainly due to the slow cleaning of the surfaces that at first limited the peak RF fields possible inside the cavity. After the optics in the long transfer line were modified, the goal of 180 mA delivered to the PSB was achieved in 1998 – and this still ranks as the highest intensity proton beam ever achieved from a linac.

Throughout its life, Linac2 has undergone many upgrades to its subsystems, including major renovations of the control systems in 1993 and 2012, the replacement of more than half of its magnet power supplies with more modern units (although a large number were still the same ones installed in the 1970s) and renovation of the RFQ and vacuum-control systems. Nevertheless, at its core, the three DTL RF cavities that form the backbone of the linac have remained unchanged since their construction, as have the more than 120 electromagnetic quadrupoles sealed in the drift tubes, each of which has pulsed more than 700 million times without a single magnet failure (figure 3).

Despite the performance and reliability of Linac2, the performance bottleneck of the injection chain for the LHC moved to the injection process of the PSB, which could only be resolved with a higher injection energy. This meant increasing the energy of the linac. At the time this was being considered, around a decade ago, Linac2 was already reaching 30 years of operation, and basing a new injector on it would have required a major consolidation effort. So the decision was made to move to a new accelerator called Linac4 (the name Linac3 is taken by an existing CERN linac that produces ions), which meant a clean slate for its design. Linac4 (figure 4) not only injects into the PSB at the higher energy of 160 MeV, but also switches to negative hydrogen-ion beam acceleration, which allows higher intensities to be accumulated in the PSB after removing the excess electrons.

As was the case when Linac2 took over from Linac1, Linac4 has been built in its own tunnel, allowing construction and commissioning to take place in parallel to the operation of Linac2 for the LHC (CERN Courier January/February 2018 p19). In connecting Linac4 to the PSB, some of the Linac2 transfer line will be dismantled to make space for additional shielding. But the original source, RFQ and three DTL cavities will remain in place for now – even if there is no possibility of their serving as a back-up once the change to Linac4 is made. As for the future of Linac2, hopefully you might one day be able to find part of the accelerator on display somewhere on the CERN site, so that its place in history is not forgotten.

Fixing gender in theory

Improving the participation of under-represented groups in science is not just the right thing to do morally. Science benefits from a community that approaches problems in a variety of different ways, and there is evidence that teams with mixed perspectives increase productivity. Moreover, many countries face a skills gap that can only be addressed by training more scientists, drawing from a broader pool of talent that cannot reasonably exclude half the population.

In the high-energy theory (HET) community, where creativity and originality are so important, the problem is particularly acute. Many of the breakthroughs in theoretical physics have come from people who think “differently”, yet the community does not acknowledge that being predominantly male and white encourages groupthink and a lack of originality.

The gender imbalance in physics is well documented. Data from the American Physical Society and the UK Institute of Physics indicate that around 20% of the physics-research community is female, and the situation deteriorates significantly as one looks higher on the career ladder. By contrast, the percentage of women is higher in astronomy, and the number of women at senior levels in astronomy has increased quite rapidly over the last decade.

However, research into gender in science often misses issues specific to particular disciplines such as HET. While many previous studies have explored challenges faced by women in physics, theory has not specifically been targeted, even though the representation of women is anomalously low.

In 2012, a group of string theorists in Europe launched a COST (European Cooperation in Science and Technology) action with a focus on gender in high-energy theory. Less than 10% of string theorists are female, and, worryingly, postdoc-application data in Europe show that the percentage of female early-career researchers has not changed significantly over the past 15 years.

The COST initiative enabled qualitative surveys and the collection of quantitative data. We found some evidence that women PhD students are less likely to continue onto postdoctoral positions than male ones, although further data are needed to confirm this point. The data also indicate that the percentage of women at senior levels (e.g. heads of institutes) is extremely low, less than 5%. Qualitative data raised issues specific to HET, including the need for mobility for many years before getting a permanent position and the long working hours, which are above average even for academics. A series of COST meetings also provided opportunities for women in string theory to network and to discuss the challenges that they face.

Following the conclusion of the COST action in 2017, women from the string theory community obtained support to continue the initiative, now broadened to the whole of the HET community. “GenHET” is a permanent working group hosted by the CERN theory department whose goals are to increase awareness of gender issues, improve the presence of women in decision-making roles, and provide networking, support and mentoring for women, particularly during their early career.

GenHET’s first workshop on high-energy theory and gender was hosted by CERN in September, bringing together physicists, social scientists and diversity professionals (see Faces and Places). Further meetings are planned, and the GenHET group is also developing a web resource that will collect research and reports on gender and science, advertise activities and jobs, and offer advice on evidence-based practice for supporting women. GenHET aims to propose concrete actions, for example encouraging the community to implement codes of conduct at conferences, and all members of the HET community are welcome to join the group.

Diversity is about much more than gender: in the HET community, there is also under-representation of people of colour and LGBTQ+ researchers, as well as those who are disabled, carers, come from less privileged socio-economic backgrounds, and so on. GenHET will work in collaboration with networks focusing on other diversity characteristics to help improve this situation, turning the high-energy theory community into one that truly reflects all of society.

CMS weighs in on flavour anomalies

A report from the CMS experiment

Recent results from LHCb and other experiments appear to challenge the assumption of lepton-flavour universality. To explore further, the CMS collaboration has recently conducted a new search probing one of the theories that attempts to explain these flavour “anomalies”. Using 77.3 fb–1 of proton–proton collision data recorded in 2016 and 2017 at a centre-of-mass energy of 13 TeV, the CMS analysis is the first dedicated search for a neutral gauge boson with specific properties that couples only to leptons of the second and third generations.

Although the Standard Model (SM) has been successful in describing current experimental results, it is generally believed to be incomplete. It cannot, for example, explain dark matter or the observed asymmetry between matter and antimatter in the universe. There are also several smaller differences between experimental results and SM predictions that have been building up over the last few years. One set of intriguing anomalies has been reported by LHCb and other dedicated B-physics experiments, indicating a possible lepton-flavour universality violation in B-meson decays (CERN Courier April 2018 p23). Another is the long-standing tension in the measurement of the anomalous magnetic moment of the muon, for which an updated measurement is eagerly awaited (CERN Courier September 2018 p9).

One extension to the SM that has been proposed to explain these anomalies is an enlarged SM gauge group with an additional U(1) symmetry. Spontaneous breaking of this symmetry leads to the prediction of a new massive gauge boson, Zʹ. To keep the extended gauge symmetry free from quantum anomalies, only certain generation-dependent couplings are allowed. The model investigated by CMS promotes the difference in lepton numbers between the second and third generation to a local gauge symmetry, and until now has only been constrained slightly by experiment. Since the predicted Zʹ boson only couples to second- and third-generation leptons, the only way to produce it at the LHC is as final-state radiation off one of these leptons. The ideal source of muons for the purposes of this search is the decay of the SM Z boson to two muons, which can be measured with excellent mass resolution (~1%) in CMS. If a Zʹ boson exists, it will be radiated by one of the muons and decay subsequently to another pair of muons, leading to a final state with four muons.

Such a final state is also produced by a rare SM Z-boson decay to four muons mediated by an off-shell photon. The first observation of this rare decay of the SM Z boson in proton–proton collisions was reported by CMS in 2012. In order to reduce this background, the search exploits the resonant character of the new gauge boson’s di-muon decay. Events are selected that contain at least four muons with an invariant mass near the SM Z-boson mass. Di-muon candidates are then formed from muon pairs of opposite sign and a peak in their invariant mass distribution is sought, which would indicate the presence of a Zʹ particle.
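To illustrate the pairing logic described above – and nothing more, since the event model, the mass window and the function names below are invented for this sketch and do not reflect the CMS software – the candidate building could look roughly like this:

from itertools import combinations
import numpy as np

Z_MASS = 91.19  # GeV

def inv_mass(p4s):
    # Invariant mass of a set of four-vectors given as (E, px, py, pz) in GeV.
    E, px, py, pz = np.sum(np.asarray(p4s, dtype=float), axis=0)
    return np.sqrt(max(E**2 - px**2 - py**2 - pz**2, 0.0))

def dimuon_masses(muons, window=5.0):
    # `muons` is a list of (charge, (E, px, py, pz)) tuples -- a toy event model.
    # Keep events with at least four muons whose combined mass lies near the Z,
    # then return the invariant masses of all opposite-sign di-muon pairings.
    if len(muons) < 4:
        return []
    charges = [q for q, _ in muons]
    p4s = [p for _, p in muons]
    if abs(inv_mass(p4s) - Z_MASS) > window:   # four-muon mass window (assumed value)
        return []
    return [inv_mass([p4s[i], p4s[j]])
            for i, j in combinations(range(len(muons)), 2)
            if charges[i] * charges[j] < 0]    # opposite-sign pairs only

# Histogramming these di-muon masses over many events and looking for a narrow
# peak on top of the smooth SM background is, in essence, how a Z' signal would show up.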

The event yields are found to be consistent with the SM predictions (figure 1). Upper limits of the order of 10−8–10−7 are set on the branching fraction of a Z boson decaying to two muons and a Zʹ, with the latter also decaying into two muons, as a function of the Zʹ mass. This can be interpreted as a limit on the Zʹ particle’s coupling strength to muons, and provides the first dedicated limits on these Zʹ models at the LHC. Compared to other experiments and to indirect limits from the LHC obtained at lower centre-of-mass energies during Run 1, this search excludes a significant portion of parameter space favoured by the B-physics anomalies (figure 2). The analysis demonstrates the power and flexibility of the CMS experiment to adapt to and test new incoming physics models, which in turn react to previous experimental results, showing that experiments and theory go hand-in-hand.
