
J-PET’s plastic revolution

The J-PET detector

It is some 60 years since the conception of positron emission tomography (PET), which revolutionised the imaging of physiological and biochemical processes. Today, PET scanners are used around the world, in particular providing quantitative and 3D images for early-stage cancer detection and for maximising the effectiveness of radiation therapies. Some of the first PET images were recorded at CERN in the late 1970s, when physicists Alan Jeavons and David Townsend used the technique to image a mouse. While the principle of PET already existed, the detectors and algorithms developed at CERN made a major contribution to its development. Techniques from high-energy physics could now be about to enable another leap in PET technology.

In a typical PET scan, a patient is administered with a radioactive solution that concentrates in malignant cancers. Positrons from β+ decay annihilate with electrons from the body, resulting in the back-to-back emission of two 511 keV gamma rays that are registered in a crystal via the photoelectric effect. These signals are then used to reconstruct an image. Significant advances in PET imaging have taken place in the past few decades, and the vast majority of existing scanners use inorganic crystals – usually bismuth germanium oxide (BGO) or lutetium yttrium orthosilicate (LYSO) – organised in a ring to detect the emitted PET photons.

The main advantages of crystal detectors are their large stopping power, high probability of photoelectric conversion and good energy resolution. However, inorganic crystals are expensive, limiting the number of medical facilities equipped with PET scanners. Moreover, conventional detectors are limited in their axial field of view: currently a distance of only about 20 cm along the body can be examined simultaneously from a single bed position, meaning that several overlapping bed positions are needed to carry out a whole-body scan, and only 1% of the quanta emitted from a patient’s body are collected. Extending the scanned region from around 20 to 200 cm would not only improve the sensitivity and signal-to-noise ratio, but also reduce the radiation dose needed for a whole-body scan.
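The gain from a longer scanner is largely geometric. As a rough sketch (under strong assumptions: an idealised cylindrical detector of 40 cm radius, which is an invented value, a point source at its centre, and no account of detection efficiency or attenuation), the following Python snippet shows how the fraction of back-to-back photon pairs that intersect the barrel grows with axial length:

```python
import math

def geometric_acceptance(radius_m: float, axial_length_m: float) -> float:
    """Fraction of back-to-back photon pairs from a point source at the
    centre of an idealised cylindrical scanner that hit the barrel.
    Purely geometric: ignores detection efficiency and attenuation."""
    # Both 511 keV photons escape through the open ends unless the polar
    # angle theta of the emission axis satisfies tan(theta) >= 2R/L.
    tan_theta0 = 2.0 * radius_m / axial_length_m
    cos_theta0 = 1.0 / math.sqrt(1.0 + tan_theta0 ** 2)
    # For isotropic emission, P(theta0 <= theta <= pi - theta0) = cos(theta0).
    return cos_theta0

for length in (0.2, 2.0):  # 20 cm versus 200 cm axial field of view
    print(f"L = {length:3.1f} m: acceptance ~ {geometric_acceptance(0.4, length):.2f}")
```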

To address this challenge, several designs for whole-body scanners have been introduced, based on resistive-plate chambers, straw tubes and alternative crystal scintillators. In 2009, particle physicist Paweł Moskal of Jagiellonian University in Kraków, Poland, introduced a system that uses inexpensive plastic scintillators instead of inorganic ones for detecting photons in PET systems. Called the Jagiellonian PET (J-PET) detector, and based on technologies already employed in ATLAS, LHCb, KLOE, COSY-11 and other particle-physics experiments, it aims to enable cost-effective whole-body PET imaging.

Whole-body imaging

The current J-PET setup comprises a ring of 192 detection modules axially arranged in three layers to form a barrel-shaped detector; the construction is based on 17 patent-protected solutions. Each module consists of a 500 × 19 × 7 mm³ scintillator strip made of a commercially available material called EJ-230, with a photomultiplier tube (PMT) connected at each end. Photons are registered via the Compton effect, and each analogue signal from the PMTs is sampled in the voltage domain at four thresholds by dedicated field-programmable gate arrays.
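The virtue of sampling each pulse at several thresholds can be illustrated with a toy calculation. The sketch below is not J-PET’s firmware logic, merely a minimal illustration with invented numbers: the times at which the rising edge crosses four voltage thresholds are fitted with a straight line, whose extrapolation back to zero volts estimates the start time of the signal largely independently of its amplitude.

```python
import numpy as np

def pulse_start_time(thresholds_mv, crossing_times_ns):
    """Fit the rising edge V = a*(t - t0) through the four threshold
    crossings and return t0, the extrapolated signal start time."""
    slope, intercept = np.polyfit(crossing_times_ns, thresholds_mv, 1)
    return -intercept / slope  # time at which the fit crosses 0 mV

# Example with invented numbers: a linear edge rising at 200 mV/ns that
# starts at t = 2.0 ns crosses the four thresholds as follows.
print(pulse_start_time([30, 80, 130, 180], [2.15, 2.40, 2.65, 2.90]))  # -> 2.0
```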

In addition to recording the location and time of the electron–positron annihilation, J-PET determines the energy deposited by the annihilation photons. The 2D position of a hit is known from the scintillator position, while the third spatial component is calculated from the time difference between the signals arriving at the two ends of the scintillator, enabling direct 3D image reconstruction. PMTs connected to both sides of the scintillator strips compensate for the low detection efficiency of plastic compared to crystal scintillators and enable multi-layer detection. Modular, relatively easy to transport and with a non-magnetic, low-density central part, the scanner can be used as an insert compatible with magnetic resonance imaging (MRI) or computed tomography. Furthermore, since plastic scintillators are produced in various shapes, the J-PET approach can also be adapted for positron emission mammography (PEM) and as a range monitor for hadron therapy.
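The axial coordinate follows from a simple time-difference relation. Below is a minimal sketch, not the collaboration’s reconstruction code; the effective light-propagation speed of 12 cm/ns is an assumed, typical value for plastic scintillator, not a J-PET specification.

```python
V_EFF_CM_PER_NS = 12.0  # assumed effective light-propagation speed in the strip

def hit_position_cm(t_left_ns: float, t_right_ns: float) -> float:
    """Hit position along the strip, measured from its centre (positive
    towards the right PMT). Light from a hit at x travels (L/2 - x) to the
    right end and (L/2 + x) to the left, so t_left - t_right = 2*x/v_eff."""
    return 0.5 * V_EFF_CM_PER_NS * (t_left_ns - t_right_ns)

# Example: the left PMT fires 1.0 ns before the right one, so the hit
# lies 6 cm to the left of the centre of the 50 cm strip.
print(hit_position_cm(0.0, 1.0))  # -> -6.0
```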

The J-PET detector offers a powerful new tool to test fundamental symmetries

J-PET can also build images from positronium (a bound state of an electron and a positron) that gets trapped in intermolecular voids. In about 40% of cases, positrons injected into the human body create positronium, whose lifetime and other properties are sensitive to the environment. Currently this information is neither recorded nor used for PET imaging, but recent J-PET measurements of the positronium lifetime in normal and cancerous skin cells indicate that the properties of positronium may be used as diagnostic indicators for cancer therapy. Medical doctors are excited by the avenues opened by J-PET. These include a larger axial view (e.g. to check correlations between organs separated by more than 20 cm in the axial direction), the possibility of performing combined PET-MRI imaging at the same time and place, and the possibility of simultaneous PET and positronium (morphometric) imaging, paving the way for in vivo determination of cancer malignancy.

Such a large detector is not only potentially useful for medical applications. It can also be used in materials science, where positron annihilation lifetime spectroscopy (PALS) enables the study of voids and defects in solids, while precise measurements of positronium atoms lead to morphometric imaging and physics studies. In this latter regard, the J-PET detector offers a powerful new tool to test fundamental symmetries.

Combinations of discrete symmetries (charge conjugation C, parity P and time reversal T) play a key role in explaining the observed matter–antimatter asymmetry in the universe (CP violation) and are the starting point for all quantum field theories preserving Lorentz invariance, unitarity and locality (CPT symmetry). Positronium is a well-suited system in which to search for C, T, CP and CPT violation via angular correlations of the annihilation quanta, while positronium lifetime measurements can be used to separate the ortho- and para-positronium states (o-Ps and p-Ps). Such decays also offer the potential observation of gravitational quantum states, and are used to test Lorentz and CPT symmetry in the framework of the Standard Model Extension.

At J-PET, the following reaction chain is predominantly considered: 22Na → 22Ne* + e+ + νe, 22Ne* → 22Ne + γ, and e+e− → o-Ps → 3γ annihilation. The detection of the 1274 keV prompt γ emitted in the 22Ne* de-excitation is the start signal for the positronium-lifetime measurement. Currently, tests of discrete symmetries and quantum entanglement of photons originating from the decay of positronium atoms are the main physics topics investigated by the J-PET group. The first data taking was conducted in 2016 and six data-taking campaigns have concluded with almost 1 PB of data. Physics studies are based on data collected with a point-like source placed in the centre of the detector and covered by a porous polymer to increase the probability of positronium formation. A test measurement with a source surrounded by an aluminium cylinder was also performed. The use of a cylindrical target (figure 1, left) allows researchers to separate in space the positronium formation and annihilation (cylinder wall) from the positron emission (source). Most recently, measurements by J-PET were also performed with a cylinder whose inner wall was covered by the porous material.
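In code, the principle of the lifetime measurement is straightforward. The sketch below is illustrative only (the function names and the 20 ns cut are invented; the vacuum lifetimes in the comments are textbook values): the prompt gamma starts the clock, an annihilation hit stops it, and the decay-time distribution separates long-lived o-Ps from short-lived p-Ps.

```python
def positronium_lifetime_ns(t_prompt_ns: float, t_annihilation_ns: float) -> float:
    """Decay time of one event: the 1274 keV prompt gamma from the 22Ne*
    de-excitation starts the clock, the annihilation photons stop it."""
    return t_annihilation_ns - t_prompt_ns

def is_ops_candidate(lifetime_ns: float, cut_ns: float = 20.0) -> bool:
    """Crude classification: in vacuum o-Ps lives ~142 ns and p-Ps only
    ~0.125 ns, so long-lived events are o-Ps candidates. The 20 ns cut
    is an arbitrary illustrative value."""
    return lifetime_ns > cut_ns

print(is_ops_candidate(positronium_lifetime_ns(5.0, 120.0)))  # -> True
```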

Figure 1

The J-PET programme aims to beat the precision of previous measurements for C, CP and CPT symmetry tests in positronium, and to be the first to observe a potential T-symmetry violation. Tests of C symmetry are conducted via searches for forbidden decays of the positronium triplet state (o-Ps) to 4γ and of the singlet state (p-Ps) to 3γ. Tests of the other fundamental symmetries and their combinations will be performed by measuring the expectation values of symmetry-odd operators constructed from the spin of the o-Ps and the momenta and polarisation vectors of the photons originating from its annihilation (figure 1, right). The physical limit of such tests is expected at the level of about 10⁻⁹ due to photon–photon interactions, six orders of magnitude below the present experimental limits (e.g. at the University of Tokyo and by the Gammasphere experiment).
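As an example of what such a symmetry-odd operator looks like in practice (an illustrative choice, not necessarily the exact observable used by J-PET), the sketch below evaluates the triple product of the o-Ps spin direction with the momenta of two annihilation photons over a toy event sample; for symmetry-conserving events its mean is consistent with zero.

```python
import numpy as np

def symmetry_odd_correlation(spin, k1, k2):
    """Single-event value of S . (k1 x k2), odd under time reversal."""
    s = np.asarray(spin, dtype=float)
    return float(np.dot(s / np.linalg.norm(s), np.cross(k1, k2)))

# Toy sample with random (symmetry-conserving) kinematics: the mean
# expectation value should come out consistent with zero.
rng = np.random.default_rng(0)
values = [symmetry_odd_correlation(rng.normal(size=3),
                                   rng.normal(size=3),
                                   rng.normal(size=3))
          for _ in range(10_000)]
print(f"mean = {np.mean(values):.3f} +/- {np.std(values) / len(values)**0.5:.3f}")
```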

Since J-PET is built of plastic scintillators, it provides an opportunity to determine the photon’s polarisation through the registration of primary and secondary Compton scatterings in the detector. This, in turn, enables the study of multi-partite entanglement of photons originating from the decays of positronium atoms. The survival of particular entanglement properties in the mixing scenario may make it possible to extract quantum information in the form of distinct entanglement features, e.g. from metabolic processes in human bodies.

Currently a new, fourth J-PET layer is under construction (figure 2), with a single unit of the layer comprising 13 plastic-scintillator strips. At a mass of about 2 kg per detection unit, the modules are easy to transport, and a portable tomographic chamber can be assembled on site, its radius adjusted for different purposes by using a given number of such units.

Figure 2

The J-PET group is a collaboration between several Polish institutions – Jagiellonian University, the National Centre for Nuclear Research Świerk and Maria Curie-Skłodowska University – as well as the University of Vienna and the National Laboratory of Frascati. The research is funded by the Polish National Centre for Research and Development, the Polish Ministry of Science and Higher Education, and the Foundation for Polish Science. Although the general interest in improved medical diagnosis was the first step towards this new detector for positron annihilation, today the basic-research programme is equally advanced. The only open question at J-PET is whether a high-resolution tomographic image of the full human body will be presented before the most precise test of one of nature’s fundamental symmetries.

Quantum thinking required

Cooling technology

The High-Luminosity Large Hadron Collider (HL-LHC), due to start operating around 2026, will require a computing capacity 50–100 times greater than exists today. The big uncertainty in this number is largely due to the difficulty of knowing how well the code used in high-energy physics (HEP) can benefit from new, hyper-parallel computing architectures as they become available. Up to now, code modernisation has been an area in which the HEP community has generally not fared well.

We need to think differently to address the vast increase in computing requirements ahead. Before the Large Electron–Positron collider was launched in the 1980s, its computing challenges also seemed daunting; early predictions underestimated them by a factor of 100 or more. Fortunately, new consumer technology arrived and made scientific computing, hitherto dominated by expensive mainframes, suddenly more democratic and cheaper.

A similar story unfolded with the LHC, for which the predicted computing requirements were so large that IT planners offering their expert view were accused of sabotaging the project! This time, the technology that made it possible to meet these requirements was grid computing, conceived at the turn of the millennium and driven largely by the ingenuity of the HEP community.

Looking forward to the HL-LHC era, we again need to make sure the community is ready to exploit further revolutions in computing. Quantum computing is certainly one such technology on the horizon. Thanks to the visionary ideas of Feynman and others, the concept of quantum computing was popularised in the early 1980s. Since then, theorists have explored its mind-blowing possibilities, while engineers have struggled to produce reliable hardware to turn these ideas into reality.

Qubits are the basic units of quantum computing: thanks to quantum entanglement, n qubits can represent 2ⁿ different states on which the same calculation can be performed simultaneously. A quantum computer with 79 entangled qubits has an Avogadro number of states (about 10²³); with 263 qubits, such a machine could represent as many concurrent states as there are protons in the universe; while an upgrade to 400 qubits could contain all the information encoded in the universe.
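These figures follow from simple arithmetic on the dimension of the state space; the short check below (illustrative only) reproduces them:

```python
import math

# n entangled qubits span a state space of dimension 2**n.
for n in (79, 263, 400):
    print(f"{n} qubits -> 2^{n} ~ 10^{n * math.log10(2):.0f} states")
# 79 qubits  -> ~10^24, the order of Avogadro's number
# 263 qubits -> ~10^79, the order of the number of protons in the universe
# 400 qubits -> ~10^120
```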

However, the road to unlocking this potential – even partially – is long and arduous. Measuring the quantum states that result from a computation can prove difficult, offsetting some of the potential gains. Also, since classical logic operations tend to destroy the entangled state, quantum computers require special reversible gates. The hunt has been on for almost 30 years for algorithms that could outperform their classical counterparts. Some have been found, but it seems clear that there will be no universal quantum computer on which we will be able to compile our C++ code and then magically run it faster. Instead, we will have to recast our algorithms and computing models for this brave new quantum world.

In terms of hardware, progress is steady but the prizes are still a long way off. The qubit entanglement in existing prototypes, even when cooled to the level of millikelvins, is easily lost and the qubit error rate is still painfully high. Nevertheless, a breakthrough in hardware could be achieved at any moment.

A few pioneers are already experimenting with HEP algorithms and simulations on quantum computers, with significant quantum-computing initiatives having been announced recently in both Europe and the US. In CERN openlab, we are now exploring these opportunities in collaboration with companies working in the quantum-computing field – kicking things off with a workshop at CERN in November (see below).

The HEP community has a proud tradition of being at the forefront of computing. It is therefore well placed to make significant contributions to the development of quantum computing – and stands to benefit greatly, if and when its enormous potential finally begins to be realised.

Nanoelectronics: Materials, Devices, Applications (2 volumes)

By R Puers, L Baldi, M Van de Voorde and S E van Nooten (editors)
Wiley–VCH

Nanoelectronics: Materials, Devices, Applications

This book aims to provide an overview of both present and emerging nanoelectronics devices, focusing on their numerous applications such as memories, logic circuits, power devices and sensors. It is one unit (in two volumes) of a complete series of books that are dedicated to nanoscience and nanotechnology, and their penetration in many different fields, ranging from human health, agriculture and food science, to energy production, environmental protection and metrology.

After an introduction to the semiconductor industry and its development, different kinds of devices are discussed. Specific chapters are also dedicated to new materials, device-characterisation techniques, smart manufacturing and advanced circuit design. The many applications are then covered, which also shows the emerging trends and economic factors influencing the progress of the nanoelectronics industry.

Since nanoelectronics is nowadays fundamental for any science and technology that requires communication and information processing, this book can be of interest to electronic engineers and applied physicists working with sensors and data-processing systems.

Picturing Quantum Processes: A First Course in Quantum Theory and Diagrammatic Reasoning

By Bob Coecke and Aleks Kissinger
Cambridge University Press

Picturing Quantum Processes

“This book is about telling the story of quantum theory entirely in terms of pictures,” declare the authors of this unusual book, in which quantum processes are explained using diagrams and an innovative method for presenting complex theories is set up. The book employs a unique formalism developed by the authors, which allows a more intuitive understanding of quantum features and eliminates complex calculations. As a result, knowledge of advanced mathematics is not required.

The entirely diagrammatic presentation of quantum theory proposed in this (bulky) volume is the result of 10 years of work and research carried out by the authors and their collaborators, uniting classical techniques in linear algebra and Hilbert spaces with cutting-edge developments in quantum computation and foundational QM.

An informal and entertaining style is adopted, which makes this book easily approachable by students at their first encounter with quantum theory. That said, it will probably appeal more to PhD students and researchers who are already familiar with the subject and are interested in looking at a different treatment of this matter. The text is also accompanied by a rich set of exercises.

Essential Quantum Mechanics for Electrical Engineers

By Peter Deák
Wiley–VCH

Essential Quantum Mechanics for Electrical Engineers

The most recent and upcoming developments of electronic devices for information technology are increasingly based on physical phenomena that cannot be understood without some knowledge of quantum mechanics (QM). In the new hardware, switching happens at the level of single electrons and tunnelling effects are frequently used; in addition, the superposition of electron states is the foundation of quantum information processing. As a consequence, the study of QM, as well as of informatics, is now being introduced in undergraduate electrical and electronic engineering courses. However, there is still a lack of textbooks on this subject written specifically for such courses.

The aim of the author was to fill this gap and provide a concise book in which both the basic concepts of QM and its most relevant applications to electronics and information technologies are covered, making use of only the very essential mathematics.

The book starts off with classical electromagnetism and shows its limitations when it comes to describing the phenomena involved in modern electronics. More advanced concepts are then gradually introduced, from wave–particle duality to the mathematical construction used to describe the state of a particle and to predict its properties. The quantum well and tunnelling through a potential barrier are explained, followed by a few applications, including light-emitting diodes, infrared detectors, quantum cascade lasers, Zener diodes, flash memories and the scanning tunnelling microscope. Finally, the author discusses some of the consequences of QM for the chemical properties of atoms and other many-electron systems, such as semiconductors, as well as the potential hardware for quantum information processing.

Even though the mathematical formulation of basic concepts is introduced when required, the author’s approach is oriented towards limiting calculations and abstraction in favour of practical applications. Applets, accessible on the internet, are also used as a support, to ease the computational work and quickly visualise the results.

Third Thoughts

By Steven Weinberg
The Belknap Press of Harvard University Press

Third Thoughts

When Nobel laureates offer their point of view, people are generally curious to listen. A self-described rationalist, realist, reductionist and devoutly secular thinker, Steven Weinberg has published a new book reflecting on current affairs in science and beyond. In Third Thoughts, he addresses themes that are of interest to both laypeople and researchers, such as the public funding of science.

Weinberg shared the Nobel Prize in Physics in 1979 for unifying the weak interaction and electromagnetism into the electroweak theory, the core of the Standard Model, and has made many other significant contributions to physics. At the same time, Weinberg has been and remains a keen science populariser. Probably his most famous work is the popular-science book The First Three Minutes, where he recounts the evolution of the universe immediately following the Big Bang.

Third Thoughts is his third collection of essays for non-specialist readers, following Lake Views (2009) and Facing Up (2001). In it are 25 essays divided into four themes: science history, physics and cosmology, public matters, and personal matters. Some are the texts of speeches, some were published previously in The New York Review of Books, and others are released for the first time.

The essays span subjects from quantum mechanics to climate change, from broken symmetry to cemeteries in Texas, and are pleasantly interspersed with his personal life stories. As in his previous collections, Weinberg deals with topics that are dear to him: the history of science, science spending, and the big questions about the future of science and humanity.

The author defines himself as an enthusiastic amateur in the history of science, albeit a “Whig interpreter” (meaning that he evaluates past scientific discoveries by comparing them to the current advancements – a method that irks some historians). Beyond that, his taste for controversy encourages him to cogitate over Einstein’s lapses, Hawking’s views, the weaknesses of quantum mechanics and the US government’s financing choices, among others.

Readers who are interested in US politics will find the section “Public matters” very thought-provoking. In particular, the essay “The crisis of big science” is based on a talk he gave at the World Science Festival in 2011, later published in The New York Review of Books. He explains the need for big scientific projects, and describes how both cosmology and particle physics are struggling for governmental support. Though still disappointed by the cancellation of the Superconducting Super Collider (SSC) in the early 1990s, he is excited by the new endeavours at CERN. He reiterates his frank opinions against manned space flight, and emphasises how some scientific obstacles are intertwined with the historical panorama. In this way, Weinberg sets the cancellation of the SSC in a wider problematic context, in which education, healthcare, transportation and law enforcement are under threat.

The author condenses the essence of what physicists have learnt so far about the laws of nature and why science is important. This is a book about asking the right questions when the time is ripe to look for the answers. He explains that the question “What is the world made of?” had to wait for the advances in chemistry at the end of the 18th century, and “What is the structure of the electron?” had to wait for quantum mechanics, while “What is an elementary particle?” is still waiting for an answer.

The essays vary in difficulty, and some concepts and views are repeated in several essays, so each of them can be read independently. While most are digestible for readers without any background knowledge in particle physics, a general understanding of the Standard Model would help in grasping the content of some of the paragraphs. Having said that, the general reader can still follow the big picture and the logically argued thoughts.

Several essays talk about CERN. More specifically, the article “The Higgs, and beyond” was written in 2011, before the announcement of the Higgs boson discovery, and briefly presents the possibility of technicolour forces. The following essay, “Why the Higgs?”, was commissioned just after the announcement in 2012 to explain “what all the fuss is about”.

One of the most curious essays to explore is number 24. Citing Weinberg: “Essay 24 has not been published until now because everyone who read it disagreed with it, but I am fond of it so bring it out here.” There, he draws parallels between his job as a theoretical physicist and that of creative artists.

Not all scientists are able to write in such an unconstrained and accessible way. Despair, sorrow, frustration, doubt, uneasiness and wishes all emerge page after page, offering the reader the privilege of coming closer to one of the sharpest scientific minds of our era.

From Stars to States: A Manifest for Science in Society

By Thierry Courvoisier
Springer

From Stars to States

This book is a curiosity, but like many curiosities, well worth stumbling across. It is the product of a curious, roving mind with a long and illustrious career dedicated to the exploration of nature and the betterment of society. Pieced together with cool scientific logic, it takes the reader from a whistle-stop tour of modern astronomy through the poetry collection of Jocelyn Bell-Burnell, to a science-inspired manifesto for the future of our planet. After an opening chapter tracing the development of astronomy from the 1950s to now, subsequent chapters show how gazing at the stars, and learning from doing so, has brought benefit to people from antiquity to modern times across a wide range of disciplines.

Astronomy helped our ancestors to master time, plant crops at the right moment, and navigate their way across wide oceans. There’s humour in the form of speculation about the powers of persuasion of those who convinced the authorities of the day to build the great stone circles that dot the ancient world, allowing people to take time down from the heavens. These were perhaps the Large Hadron Colliders of their time, and, in Courvoisier’s view, probably took up a considerably larger fraction of ancient GDP (gross domestic product) than modern scientific instruments. John Harrison’s remarkable clocks are given pride of place in the author’s discussion of time, though the perhaps even more remarkable Antikythera mechanism is strangely absent.

By the time we reach chapter three, the beginnings of a virtuous circle linking basic science to technology and society are beginning to appear, and we can start to guess where Courvoisier is taking us. The author is not only an emeritus professor of astronomy at the University of Geneva, but also a former president of the Swiss Academy of Sciences and current president of EASAC, the European Academies Science Advisory Council. For good measure, he is also president of the H Dudley Wright Foundation, a charitable organisation that supports science communication activities, mainly in French-speaking Switzerland. He is, in short, a living, breathing link between science and society.

In chapter four, we enjoy the cultural benefits of science and the pleasure of knowledge for its own sake. We have a glimpse of what in Swiss German is delightfully referred to as Aha Erlebnis – that eureka moment when ideas just fall into place. It reminded me of the passage in another curious book, Kary Mullis’s Dancing Naked in the Mind Field, in which Mullis describes the Aha Erlebnis that led to him receiving the Nobel Prize in Chemistry in 1993. It apparently came to him so strongly out of the blue on a night drive along a California freeway that he had to pull off the road and write it down. Edison’s famous 1% inspiration may be rare, but what a wonderful thing it is when it happens.

Chapter five begins the call to action for scientists to take up the role that their field demands of them in society. “We still need to generate the culture required to […] bring existing knowledge to places where it can and must contribute to actions fashioning the world.” Courvoisier examines the gulf between the rational world of science and the rather different world of policy – a gulf once memorably described by Lew Kowarski in his account of the alliance between scientists and diplomats that led to the creation of CERN. “It was a pleasure to watch the diplomats grapple with the difference between a cyclotron and a plutonium atom,” he said. “We had to compensate by learning how to tell a subcommittee from a working party, and how – in the heat of a discussion – to address people by their titles rather than their names. Each side began to understand the other’s problems and techniques; a mutual respect grew in place of the traditional mistrust between egg-headed pedants and pettifogging hair-splitters.” CERN stands as evidence of the good that comes when science and policy work together.

As we reach the business end of the book, we find a rallying call for strengthening our global institutions, and here another of Courvoisier’s influences comes to the fore. He’s Swiss, and a scientist. Scientists have long understood the benefits of collaboration, and if there is one country in the world that has managed to reconcile the nationalism of its regions with the greater need of the supra-cantonal entity of the country as a whole, it is Switzerland. It would be a gross oversimplification to say that Courvoisier’s manifesto is to apply the Swiss model to global governance, but you get the idea.

The book was originally published in French by the Geneva publisher Georg, and if there’s one criticism I have, it’s the translation. It made Catherine Bréchignac, who writes with fluidity in French, come across as rather clunky in her introduction, and on more than one occasion I found myself wondering if the words I was reading were really expressing what the author wanted to say. Springer and the Swiss Academy of Sciences are to be lauded for bringing this manifesto to an Anglophone audience, but for those who read French, I’d recommend the original.

Particle interactions up to the highest energies

The 20th International Symposium on Very High Energy Cosmic Ray Interactions (ISVHECRI 2018) was held in Nagoya, Japan, on 21–25 May. More than 120 attendees from 19 countries discussed various aspects of hadronic interactions at the intersection between high-energy cosmic-ray physics and classical accelerator-based particle physics. The 65 contributions reflected the large diversity and interdisciplinary character of this biennial series, which is held under the auspices of the International Union of Pure and Applied Physics.

In his opening address, Sunil Gupta paid a tribute to Oscar Saavedra, one of the leading scientists and founders of the ISVHECRI series, who passed away in 2018. Following the long tradition of this symposium series, the main topic was the discussion of particle physics of relevance to extensive air showers, secondary cosmic-ray production, and hadronic multi-particle production at accelerators. This time, the symposium expanded its coverage of multi-messenger astrophysics, especially to neutrino and gamma-ray astrophysics. Many talks were invited from the Pierre Auger Observatory and Telescope Array, as well as from IceCube, Super-Kamiokande, CTA and HAWC, and space-borne experiments such as AMS-02, Fermi and CALET.

Participants discussed how many open questions in high-energy astroparticle physics are related to our understanding of cosmic-ray interactions from the multi-messenger point of view; for example, the relevance of production and propagation of positrons or antimatter for indirect dark-matter searches, or of atmospheric-neutrino production for neutrino oscillations or neutrino astronomy.

Showcasing several models of high-energy cosmic-ray interactions, and their verification by accelerator measurements, was also a highlight of the symposium. The event offered a unique opportunity for developers of major cosmic-ray interaction models to gather and engage in valuable discussions. Other highlights were the talks about accelerator data relevant to cosmic-ray observations, reported by the teams behind CERN’s large LHC experiments as well as smaller fixed-target experiments such as NA61. Emphasis was put on forward measurements by ATLAS, CMS, LHCb and LHCf, including first results from the SMOG gas-jet target measurements of LHCb (see “Fixed-target physics in collider mode at LHCb“).

A public lecture, “Exploring the Invisible Universe”, by Nobel laureate Takaaki Kajita attracted more than 250 participants, and was complemented by a tour of the nuclear-emulsion laboratory of Nagoya University to see state-of-the-art emulsion technology. The progress in this technology was clearly visible when Edison Shibuya and others recalled the early days of studying cosmic rays with emulsion chambers and Saavedra’s related pioneering contributions.

There were many discussions on future studies of relevance to cosmic-ray interactions and astroparticle physics. Hans Dembinski discussed prospects in the near and far future in collider experiments, including possible proton–oxygen runs at the LHC and a study of multi-particle production at a future circular collider. The cosmic-ray community is very enthusiastic about a future proton–oxygen run since, even with a short run of 100 million events, charged particle and pion spectra could be measured to an accuracy of 10% – a five-fold improvement over current model uncertainties that would bring us a crucial step closer to unveiling the cosmic accelerators of the highest energy particles in the universe.

The next ISVHECRI will be held in June 2020 at Ooty, the location of the GRAPES-3 air-shower experiment in India.

AWAKE accelerates electrons in world first

AWAKE spectrometer signal

The AWAKE experiment at CERN has passed an important milestone towards compact, high-energy accelerators for applications in future high-energy physics experiments. Reporting in Nature on 29 August, the 18 institute-strong international collaboration has for the first time demonstrated the acceleration of electrons in a plasma wakefield generated by a proton beam. The AWAKE team injected electrons into plasma at an energy of around 19 MeV and, after travelling a distance of 10 m, the electrons emerged with an energy of about 2 GeV – representing an average acceleration gradient of around 200 MV/m. For comparison, radio-frequency (RF) cavities in high-energy linear accelerators used for X-ray free-electron lasers achieve typical gradients of a few tens of MV/m.
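The quoted gradient follows directly from those numbers; a short check (illustrative arithmetic only):

```python
# Electrons enter at ~19 MeV and leave at ~2 GeV after 10 m of plasma.
e_in_mev, e_out_mev, length_m = 19.0, 2000.0, 10.0
print(f"average gradient ~ {(e_out_mev - e_in_mev) / length_m:.0f} MV/m")  # ~198 MV/m
```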

Plasma-wakefield acceleration still has a long way to go before it can rival the performance of conventional RF technology, however. First proposed in the late 1970s, the technique accelerates charged particles by forcing them to “surf” atop a longitudinal plasma wave that contains regions of positive and negative charge. Two beams are required: a “witness” beam, which is to be accelerated, and a “drive” beam, which generates the wakefield. Initial experiments took place with laser and electron drive beams at SLAC and elsewhere in the 1990s, and the advent of high-power lasers as wakefield drivers led to increased activity. Such techniques are now capable of bringing electrons to energies of a few GeV over a distance of a few centimetres.

AWAKE (the Advanced Wakefield Experiment) is a proof-of-principle R&D project that is the first to use protons for the drive beam. Since protons penetrate deeper into the plasma than electrons and lasers, thereby driving the wakefield over a greater distance, they can potentially accelerate electrons to much higher energies in a single plasma stage. The experiment is driven by a bunch of 400 GeV protons from the Super Proton Synchrotron, which is injected into a plasma cell containing rubidium gas at a temperature of around 200 °C. An accompanying laser pulse ionises the rubidium gas and transforms it into a plasma. As the proton bunch travels through the plasma, it splits into a series of smaller bunches via a process called self-modulation, and these micro-bunches generate a strong wakefield as they move. A bunch of witness electrons is then injected at an angle into this oscillating plasma at relatively low energy and rides the plasma wave to be accelerated. At the other end of the plasma, a dipole magnet bends the incoming electrons onto a scintillator, allowing the energy of the outgoing particles to be measured (see figure).

AWAKE has made rapid progress since its inception in 2013. Following the installation of the plasma cell in early 2016, in the tunnel formerly used by part of the CNGS facility at CERN, a proton-driven wakefield in a plasma was observed for the first time by the end of that year (CERN Courier January/February 2017 p8). The electron source, electron beam line and electron spectrometer were installed during 2017, completing the preparatory phase at the beginning of 2018, and the first electron acceleration was recorded early in the morning of 26 May.

So far, the AWAKE demonstration involves low-intensity electron bunches; the next steps include plans to create a high-energy electron beam of sufficient quality to be useful for applications, although tests will pause at the end of the year when the CERN accelerator complex shuts down for two years of upgrades and maintenance. A first application of AWAKE would be to deliver accelerated electrons to an experiment, extending the project with a fully fledged physics programme of its own. For eventual collider experiments, another hurdle is the ability to accelerate positrons. In the longer term, a global effort is under way to develop wakefield-acceleration techniques for a multi-TeV linear collider (CERN Courier December 2017 p31).

Although still at an early stage of development, the use of plasma wakefields could drastically reduce the size and therefore cost of accelerators. Edda Gschwendtner, technical coordinator and CERN project leader for AWAKE, says that the ultimate aim is to attain an average acceleration gradient of around 1 GV/m so that electrons can be accelerated to the TeV scale in a single stage. “We are looking forward to obtaining more results from our experiment to demonstrate the scope of plasma wakefields as the basis for future particle accelerators.”

ALPHA takes antihydrogen to the next level

Antihydrogen 1S–2P spectral line shape

The ALPHA experiment at CERN’s Antiproton Decelerator (AD) has made yet another seminal measurement of the properties of antiatoms. Following its determination last year of both the ground-state hyperfine and the 1S–2S transitions in antihydrogen, the latter representing the most precise measurement of antimatter ever made (CERN Courier May 2018 p7), the collaboration has reported in Nature the first measurement of the next fundamental energy level: the Lyman-alpha transition. The result demonstrates that ALPHA is quickly and steadily paving the way for precision experiments that could uncover as yet unseen differences between the behaviour of matter and antimatter (CERN Courier March 2018 p30).

The Lyman-alpha (or 1S–2P) transition is one of several in the Lyman series that were discovered in atomic hydrogen just over a century ago. It corresponds to a wavelength of 121.6 nm and is a special transition in astronomy because it allows researchers to probe the state of the intergalactic medium. Finding any slight difference between such transitions in antimatter and matter would shake one of the foundations of quantum field theory, charge–parity–time (CPT) symmetry, and perhaps cast light on the observed cosmic imbalance of matter and antimatter.

The ALPHA team makes antihydrogen atoms by taking antiprotons from the AD and binding them with positrons from a sodium-22 source, confining the resulting antihydrogen atoms in a magnetic trap. A laser is used to measure the antiatoms’ spectral response, requiring a range of laser frequencies and the ability to count the number of atoms that drop out of the trap as a result of interactions between the laser and the trapped atoms. Having successfully employed this technique to measure the 1S–2S transition, ALPHA has now measured the Lyman-alpha transition frequency with a precision of a few parts in a hundred million: 2,466,051.7 ± 0.12 GHz. The result agrees with the prediction for the equivalent transition in hydrogen to a precision of 5 × 10⁻⁸.
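The quoted relative precision is easy to verify from the measured value (illustrative arithmetic only):

```python
freq_ghz = 2_466_051.7   # measured 1S-2P transition frequency
sigma_ghz = 0.12         # quoted uncertainty
print(f"relative precision ~ {sigma_ghz / freq_ghz:.1e}")  # ~4.9e-08
```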

Although the precision is not as high as that achieved in hydrogen, the finding represents a pivotal technological step towards laser cooling of antihydrogen and the extension of antimatter spectroscopy to quantum states possessing orbital angular momentum. Simulations indicate that cooling to about 20 mK is possible with the current ALPHA set-up, which, combined with other planned improvements, would reduce the 1S–2S transition line width (see figure) by more than an order of magnitude. At such levels of precision, says the team, antihydrogen spectroscopy will have an impact on the determination of fundamental constants, in addition to providing elegant tests of CPT symmetry. Laser cooling will also allow precision tests of the weak equivalence principle via antihydrogen free-fall or antiatom-interferometry experiments.

“The Lyman-alpha transition is notoriously difficult to probe – even in normal hydrogen”, says ALPHA spokesperson Jeffrey Hangst. “But by exploiting our ability to trap and hold large numbers of antihydrogen atoms for several hours, and using a pulsed source of Lyman-alpha laser light, we were able to observe this transition. Next up is laser cooling, which will be a game-changer for precision spectroscopy and gravitational measurements.”
