
Ascent commemorates cosmic-ray pioneers

A hot-air balloon commemorating the discovery of cosmic rays

On 25 January, a muon detector, a particle physicist and a prizewinning pilot ascended 4000 m above the Swiss countryside in a hot-air balloon to commemorate the discovery of cosmic rays. The event was the highlight of the opening ceremony of the 42nd Château-d’Oex International Balloon Festival, attended by an estimated 30,000 people, and attracted significant media coverage.

In the early 1900s, following Becquerel’s discovery of radioactivity, studying radiation was all the rage. Portable electrometers were used to measure the ionisation of air in a variety of terrestrial environments, from fields and lakes to caves and mountains. Reasoning that ionisation should decrease with altitude, pioneers ventured on balloon flights as early as 1909 to count the number of ions per cm³ of air as a function of altitude. First results indeed indicated a decrease up to 1300 m, but a subsequent ascent to 4500 m by Albert Gockel, professor of physics at Fribourg, concluded that ionisation does not decrease and possibly increases with altitude. Gockel, however, who would later coin the term “cosmic radiation”, was unable to obtain the hydrogen needed to go to higher altitudes. And so it fell to Austrian physicist Victor Hess to settle the case. Ascending to 5300 m in 1912, Hess clearly identified an increase, and went on to share the 1936 Nobel Prize in Physics for the discovery of cosmic rays. Gockel, who died in 1927, could not be recognised – the prize is not awarded posthumously – and for that reason is almost forgotten by history.

ATLAS experimentalist Hans Peter Beck of the University of Bern, and a visiting professor at the University of Fribourg, along with two students from the University of Fribourg, re-enacted Gockel’s and Hess’s pioneering flights using 21st-century technology: a muon telescope called the Cosmic Hunter, newly developed by instrumentation firm CAEN. The educational device, which counts coincidences in two scintillating-fibre tiles of 15 × 15 cm² separated by 15 cm, verified that the flux of cosmic rays increases as a function of altitude. Within two hours of landing, including a one-hour drive back to the starting point, Beck was able to present the data plots during a public talk attended by more than 250 people. A second flight up to 6000 m is planned, with oxygen supplies for passengers, when weather conditions permit.
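The Cosmic Hunter’s basic measurement, counting near-simultaneous hits in its two stacked tiles, can be sketched in a few lines of Python. The 40 ns coincidence window and the timestamps below are illustrative assumptions, not CAEN’s actual electronics logic:

```python
def count_coincidences(hits_top, hits_bottom, window=40e-9):
    """Count pairs of hits in the two tiles falling within `window` seconds
    of each other, the signature of a through-going muon. Both inputs are
    sorted lists of timestamps in seconds."""
    count, j = 0, 0
    for t in hits_top:
        # skip bottom-tile hits that are too early to match this top hit
        while j < len(hits_bottom) and hits_bottom[j] < t - window:
            j += 1
        if j < len(hits_bottom) and abs(hits_bottom[j] - t) <= window:
            count += 1
            j += 1  # each bottom hit pairs with at most one top hit
    return count

# toy data: three genuine coincidences plus two uncorrelated noise hits
top = [1.0e-6, 2.0e-6, 3.0e-6, 9.0e-6]
bottom = [1.00001e-6, 2.00002e-6, 3.00001e-6, 5.0e-6]
print(count_coincidences(top, bottom))  # -> 3
```

Repeating such counts at different altitudes, normalised to exposure time, yields the flux-versus-altitude trend the flight verified.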

The view from inside the hot-air balloon

“Relating balloons with particle physics was an easy task, given the role balloons played in the early days for the discovery of cosmic rays,” says Beck. “It is a narrative that works and that touches people enormously, as the many reactions at the festival have shown.”

The event – a collaboration with the universities of Bern and Fribourg, the Swiss Physical Society, and the Jungfraujoch research station – ran in parallel to a special exhibition about cosmic rays at the local balloon museum, organised by Beck and Michael Hoch from CMS, which was the inspiration for festival organisers to make physics a focus of the event, says Beck: “Without this, the festival would never have had the idea to bring ‘adventure, science and freedom’ as this year’s theme. It’s really exceptional.”

AMS detector given a new lease of life

Checking the installation of the Upgraded Tracker Thermal Pump System for AMS

On 25 January, European Space Agency astronaut Luca Parmitano stepped outside a half-million-kilogramme structure travelling at tens of thousands of kilometres per hour, hundreds of kilometres above Earth, and, tethered by a thin cord, ventured into the vacuum of space to check for a leak.

It was the fourth such extravehicular activity (EVA) he’d been on in two months. All things considered, the task ahead was relatively straightforward: to make sure that a newly installed cooling system for the Alpha Magnetic Spectrometer (AMS), the cosmic-ray detector that has been attached to the International Space Station (ISS) since 2011, had been properly plumbed in.

Heart-stopping spacewalks

The first EVA on 15 November saw Parmitano and fellow astronaut, NASA’s Drew Morgan, remove and jettison the AMS debris shield, which is currently still spiralling its way to Earth, to allow access to the experiment’s cooling system. The CO₂ pumps, needed to keep the 200,000-channel tracker electronics at a temperature of 10 ± 3 °C, had started to fail in 2014 – which was no surprise, as AMS was initially only supposed to operate for three years. During the second EVA on 22 November, the astronauts cut through the cooling system’s eight stainless-steel lines to isolate and prepare it for removal, and a critical EVA3 on 2 December saw Morgan and Parmitano successfully connect the new pump system, which had been transported to the ISS by an Antares rocket the previous month. Then came a long wait until January to find out if the repair had been successful.

“EVA4 was the heart-stopping EVA because that’s where we did the leak tests on all those tubes,” says Ken Bollweg, NASA’s AMS project manager. The success of the previous EVAs suggested that the connections were going to be fine. But Parmitano arrived at the first tube, attached one of 29 bespoke tools developed specially for the AMS repair, and saw that the instrument had issued a warning signal. “I see red,” he reported to anxious teams at NASA’s Johnson Space Center’s Mission Control Center and the AMS Payload Operations Control Centre (POCC) at CERN’s Prévessin site, from where spokesperson Sam Ting and his colleagues were monitoring proceedings closely. Though not huge, the leak was serious enough not to guarantee that the system would work, jeopardising four years of preparation involving hundreds of astronauts, engineers and scientists. Following procedures put in place to deal with such a situation, Parmitano tightened the connection and waited for about an hour before checking the tube again. A leak was still present. Then, after re-tightening the troublesome connection again, while the team was preparing a risky “jumper” manoeuvre to bypass the leak and make a new connection, he checked a third time: “No red!” Happy faces lit up the POCC.

NASA has learned a lot of new things from this

Ken Bollweg

AMS was never designed to be serviceable, and the repair, unprecedented in complexity for a space intervention, required the avoidance of sharp edges and other hazards in order to bring it back to full operational capacity. The chances of something going wrong were high, says Bollweg. “NASA has learned a lot of new things from this. We really pushed the envelope. It showed that we have the capabilities to do even more than we have done in the past.” EVA4 lasted almost six hours. Five hours and two minutes into it, Parmitano, who returned safely to Earth on 6 February, broke the European record for the most time spent spacewalking (33 hours and nine minutes). It’s not a job for the fainthearted. During a spacewalk in 2013, while he was wedged into a confined space outside the ISS, a malfunction in Parmitano’s spacesuit caused his helmet to start filling with water and he almost drowned.

“Building and operating AMS in space has been an incredible journey through engineering and physics, but today it is thanks to the NASA group that in AMS we can continue this journey and this is amazing. An enormous thanks to the EVA crew,” said AMS integration engineer Corrado Gargiulo of CERN. The day after EVA4, the POCC team spent about 10 hours refilling the new AMS cooling system with 1.3 kg of CO₂ and started to power up the detector. At noon on 27 January, all the detector’s subsystems were sending data back, marking a new chapter for AMS that will see it operate for the lifetime of the ISS.

Into the unknown

The 7.5-tonne AMS apparatus has so far recorded almost 150 billion charged cosmic rays with energies up to the multi-TeV range, and its percent-level results show clear and unexpected behaviour of cosmic-ray events at high energies. A further 10 years of operation will allow AMS to make conclusive statements on the origin of these unexpected observations, says Ting. “NASA is to be congratulated on seeing this difficult project through over a period of many years. AMS has observed unique features in cosmic-ray spectra that defy traditional explanations. We’re entering into a region where nobody has been before.”

AMS has observed unique features in cosmic-ray spectra that defy traditional explanations

Sam Ting

The first major result from AMS came in 2013, when measurements of the cosmic positron fraction (the ratio of the positron flux to the flux of electrons and positrons) up to an energy of 350 GeV showed that the spectrum fits well to dark-matter models. The following year, AMS published the positron and electron fluxes, which showed that neither can be fitted with the single-power-law assumption underpinning the traditional understanding of cosmic rays. The collaboration has continued to find previously unobserved features in the measured fluxes and flux ratio of electrons and positrons, publishing the results in several high-profile papers during the past couple of years.

Figure 1. The positron spectrum measured by AMS (yellow), showing that low-energy positrons mostly come from cosmic-ray collisions (shaded area). Unexpectedly, there is a continuous excess starting at 25 GeV. The spectrum reaches a maximum at around 284 GeV, followed by a sharp drop-off with a finite energy cutoff established at 99.99% confidence.
Figure 2. Comparison between 0.6 million antiprotons (blue, right axis) with 1.9 million positrons (yellow, left axis) using the latest AMS data.

Last year, AMS reaffirmed the complex energy dependence exhibited by the positron flux: a significant excess starting from 25 GeV, a sharp drop-off above 284 GeV and a finite energy cutoff at 810 GeV (figure 1). “In the entire energy range the positron flux is well described by the sum of a term associated with the positrons produced in the collision of cosmic rays, which dominates at low energies, and a new source term of positrons, which dominates at high energies,” says Ting. “These experimental data on cosmic-ray positrons show that, at high energies, they predominantly originate either from dark-matter annihilation or from other astrophysical sources.” Although dark-matter models predict such a cutoff, the AMS data cannot yet rule out astrophysical sources, in particular pulsars. Further intrigue comes from the latest, to-be-published, AMS result on antiprotons, which, although rare at high energies, exhibit similar functional behaviour to the positron spectrum (figure 2). “This indicates that the excess of positrons may not come from pulsars, due to the similarity of the two spectra and the high mass of antiprotons,” says Ting.
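The two-term description Ting quotes can be made concrete with a toy model: a steeply falling diffuse power law plus a harder source term with an exponential cutoff. All coefficients below are illustrative placeholders, not the published AMS fit values:

```python
from math import exp

# Toy version of the two-term positron-flux parameterisation: a diffuse term
# from cosmic-ray collisions plus a source term cut off exponentially at E_s.
# Coefficients are illustrative only, not AMS's fitted values.
def positron_flux(E, C_d=6.5e-2, g_d=-4.0, C_s=6.8e-5, g_s=-2.6, E_s=810.0):
    """Return (total, diffuse, source) flux at energy E in GeV,
    in arbitrary normalisation."""
    diffuse = C_d * E ** g_d                 # dominates at low energy
    source = C_s * E ** g_s * exp(-E / E_s)  # dominates at high energy
    return diffuse + source, diffuse, source

for E in (10.0, 100.0, 500.0):
    total, diffuse, _ = positron_flux(E)
    print(f"E = {E:5.0f} GeV   diffuse fraction = {diffuse / total:.2f}")
```

With these toy numbers the diffuse fraction falls steadily with energy, reproducing the qualitative crossover from collision-dominated to source-dominated positrons that the text describes.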

Thanks to the successful installation of the new AMS cooling system, the expected positron spectrum by 2028, in particular the high-energy data points, should enable an accurate comparison with dark-matter models (figure 3). High-energy (>TeV) events are also expected to provide insights into the origins of cosmic electrons, the latest results on which show that the electron flux exhibits a significant excess starting from 42 GeV.

Figure 3. Comparison between the projected positron spectrum (light blue) and the prediction from a dark-matter model (Phys. Rev. D 88 076013).
Figure 4. The electron spectrum (light blue points) fitted with the sum of two power laws (green curve) in the energy range 0.5–1400 GeV. The two power-law components a and b are represented by the grey and blue areas, respectively. The minute contribution of electrons from cosmic-ray collisions is also shown (green area).

Unlike the positron flux, which has an exponential energy cutoff at 810 GeV, the electron flux shows no cutoff (at least not below 1.9 TeV). Moreover, in the entire energy range the electron flux is well described by the sum of two power-law components (figure 4), providing “clear evidence”, says Ting, that most high-energy electrons originate from different sources than high-energy positrons.

Novelties in nuclei

Unexpected results continue to appear in data from cosmic nuclei, which make up the bulk of cosmic rays travelling through space. Helium, carbon and oxygen nuclei are thought to be mainly produced and accelerated in astrophysical sources and are known as primary cosmic rays, while lithium, beryllium and boron nuclei are produced by the collision of heavier nuclei with nuclei of the interstellar matter and are known as secondary cosmic rays.

New properties of primary cosmic rays – helium, carbon and oxygen – have been observed in the rigidity range 2 GV to 3 TV; at high energies these three spectra also have identical rigidity dependence, all deviating from a single power law above 200 GV. Similar oddities have appeared in measurements of secondary cosmic rays – lithium, beryllium and boron – in the range 1.9 GV to 3.3 TV (figure 5). The lithium and boron fluxes have an identical rigidity dependence above 7 GV, all three fluxes have an identical rigidity dependence above 30 GV, and, unexpectedly, above 30 GV the Li/Be flux ratio is approximately equal to two.

Figure 5. The rigidity dependences of the spectra of primary cosmic rays (helium, carbon and oxygen) compared to the spectra of secondary cosmic rays (lithium, beryllium and boron), all scaled to the helium flux at 30 GV.

The ratio of secondary fluxes to primary fluxes is particularly interesting because it directly measures the amount and properties of the interstellar medium. Before AMS, only the B/C ratio was measured, and it was assumed to be proportional to R^Δ, with Δ a constant, for R > 60 GV. The latest AMS results on secondary (Li, Be, B) to primary (C, O) flux ratios show that Δ is not a constant, but changes by more than 5σ between the two rigidity ranges 60 < R < 200 GV and 200 < R < 3300 GV. As with the electron and positron fluxes, none of the current AMS results can be explained by existing theoretical models. By 2028, says Ting, AMS will extend its measurements of cosmic nuclei up to Z = 30 (zinc) with sufficient statistics to get to the bottom of these and other mysteries. “We have measured many particles, electrons, positrons, antiprotons and many nuclei, and they all have distributions and none agree with current theoretical models. So we will begin to create a new field.”
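The slope Δ is simply the logarithmic derivative of the secondary-to-primary ratio. A minimal sketch with a hypothetical broken power law (the break rigidity, normalisation and slopes are invented for illustration, not the AMS measurement):

```python
from math import log

def delta(ratio, R1, R2):
    """Logarithmic slope Delta, assuming ratio(R) ~ R**Delta between R1 and R2."""
    return log(ratio(R2) / ratio(R1)) / log(R2 / R1)

# Hypothetical secondary/primary ratio: slope -0.40 below a 200 GV break,
# hardening to -0.30 above it (all values invented for illustration)
def toy_ratio(R, R_break=200.0):
    if R <= R_break:
        return 0.3 * (R / R_break) ** -0.40
    return 0.3 * (R / R_break) ** -0.30

print(delta(toy_ratio, 60.0, 200.0))    # ~ -0.40
print(delta(toy_ratio, 200.0, 3300.0))  # ~ -0.30
```

Extracting Δ separately in the two rigidity ranges, as sketched here, is how a change in slope between 60–200 GV and 200–3300 GV would show up.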

Synchrotrons on the coronavirus frontline

Representation of the 3D structure of the main SARS-CoV-2 protease, obtained using Diamond Light Source. The coils represent “alpha” helices and the flatter arrows are “beta sheets”, with loops connecting them together. The organisation of alpha helices and beta sheets is often referred to as the secondary structure of the protein (with the primary sequence being the amino acid sequence and the tertiary structure being the overall 3D shape of the protein). Credit: D Owen/Diamond Light Source.

At a time when many countries are locking down borders, limiting public gatherings and encouraging isolation, the Diamond Light Source in Oxfordshire, UK, has been ramping up its intensity, albeit in an organised and controlled manner. The reason: its scientists are working tirelessly on drug-discovery efforts to quell COVID-19.

It is a story that requires fast detectors, reliable robotics and powerful computing infrastructures, artificial intelligence, and one of the brightest X-ray sources in the world. And it is made possible by international collaboration, dedication, determination and perseverance.

Synchrotron light sources are particle accelerators capable of producing incredibly bright X-rays, by forcing relativistic electrons to accelerate on curved trajectories. Around 50 facilities exist worldwide, enabling studies over a vast range of topics. Fanning out tangentially from Diamond’s 562-m circumference storage ring are more than 30 beamlines equipped with instrumentation to serve a multitude of user experiments. The intensely bright X-rays (corresponding to a flux of around 9 × 10¹² photons per second) are necessary for determining the atomic structure of proteins, including the proteins that make up viruses. As such, synchrotron light sources around the world are interrupting their usual operations to work on mapping the structure of the SARS-CoV-2 virus.

Knowing the atomic structure of the virus is like knowing how the enemy thinks

Knowing the atomic structure of the virus is like knowing how the enemy thinks. A 3D visualisation of the building blocks of the structure at an atomic level would allow scientists to understand how the virus functions. Enzymes, the molecular machines that allow the virus to replicate, are key to this process. Scientists at Diamond are exploring the binding site of the main SARS-CoV-2 protease. A drug that binds to this enzyme’s active site would throw a chemical spanner in the works, blocking the virus’ ability to replicate and limiting the spread of the disease.

By way of reminder: Coronavirus is the family of viruses responsible for the common cold, MERS, SARS, etc. Novel coronavirus, aka SARS-CoV-2, is the newly discovered type of coronavirus, and COVID-19 is the disease which it causes.

Call to arms

On 26 January, Diamond’s life-sciences director, Dave Stuart, received a phone call from structural biologist Zihe Rao of ShanghaiTech University in China. Rao, along with his colleague Haitao Yang, had solved the structure of the main SARS-CoV-2 protease with a covalent inhibitor using the Shanghai Synchrotron Radiation Facility (SSRF) in China. Furthermore, they had made the solution freely and publicly available on the worldwide Protein Data Bank.

During the phone call, Rao informed Stuart that their work had been halted by a scheduled shutdown of the SSRF. The Diamond team rapidly mobilised. Since shipping biological samples from Shanghai at the height of the coronavirus outbreak in China was expected to be problematic, the team at Diamond ordered the synthetic gene. A synthetic gene can be generated provided the ordering of T, A, C and G nucleotides in the DNA sequence is known. That synthetic gene can be genetically engineered into a bacterium, in this case Escherichia coli, which reads the sequence and generates the coronavirus protease in large enough quantities for the researchers at Diamond to determine its structure and screen for potential inhibitors.

Eleven days later on 10 February, the synthetic gene arrived. At this point, Martin Walsh, Diamond’s deputy director of life sciences, and his team (consisting of Claire Strain-Damerell, Petra Lukacik, and David Owen) dropped everything. With the gene in hand, the group immediately set up experimental trials to try to generate protein crystals. In order to determine the atomic structure, they needed a crystal containing millions of proteins in an ordered grid-like structure.

Diamond Light Source, UK

X-ray radiation bright enough for the rapid analysis of protein structures can only be produced by a synchrotron light source. The X-rays are directed and focused down a beamline onto a crystal and, as they pass through it, they diffract. From the diffraction pattern, researchers can work backwards to determine the 3D electron density maps and the structure of the protein. The result is a complex curled ribbon-like structure with an intricate mess of twists and turns of the protein chain.
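The connection between a crystal’s resolution and the angle of its diffraction spots follows from Bragg’s law, λ = 2d sin θ. A minimal sketch, assuming a typical ~1 Å synchrotron X-ray wavelength (the actual beamline wavelength is not given in the text):

```python
from math import asin, degrees

def bragg_angle_deg(wavelength_angstrom, d_angstrom):
    """Scattering half-angle theta (degrees) from Bragg's law,
    lambda = 2 * d * sin(theta)."""
    return degrees(asin(wavelength_angstrom / (2.0 * d_angstrom)))

# ~1 A X-rays resolving 1.9 A spacings (the wavelength is an assumed
# typical value, not a quoted Diamond beamline setting)
theta = bragg_angle_deg(1.0, 1.9)
print(f"theta = {theta:.1f} degrees")  # spots out to roughly 15 degrees
```

Finer spacings diffract to larger angles, which is why recording spots far from the beam centre corresponds to the high resolution quoted later in the article.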

The Diamond team set up numerous trials to find the optimum conditions for crystallisation of the SARS-CoV-2 protease. They modified the pH, the precipitating compounds, the chemical composition, the protein-to-solution ratio… every parameter they could vary, they did. Every day they would produce a few thousand trials, of which only a few hundred would produce crystals, and even fewer would produce crystals of sufficient quality. Within a few days of receiving the gene, the first crystals were being produced. They were paltry, thin crystals, but large enough to be tested on one of Diamond’s macromolecular crystallography beamlines.

Watching the results come through, Diamond postdoc David Owen described it as the first moment of intense excitement. With crystals that appeared to be “flat like a car windshield”, he was dubious as to whether they would diffract at all. Nevertheless, the team placed the crystals in the beamline with a resignation that quickly turned into intense curiosity as the results started appearing before them. At that moment Owen remembers his doubts fading, as he thought, “this might just work!” And work it did. In fact, Owen recalls, “they diffracted beautifully.” These first diffraction patterns of the SARS-CoV-2 virus were recorded with a resolution of 1.9 angstrom (1.9 × 10⁻¹⁰ m) – high enough to see the position of all of the chemical groups that allow the protease to do its work.

By 19 February, through constant adjustments and learning, the team knew they could grow good-quality crystals quickly. It was time to bring in more colleagues. The XChem team at Diamond joined the mission to set up fragment-based screening – whereby a vast library of small molecules (“fragments”) is soaked into crystals of the viral protease. These fragments are significantly smaller and functionally simpler than most drug molecules, and screening them is a powerful way to select candidates for early drug discovery. By 26 February, 600 crystals had been mounted and the first fragment screen launched. In parallel, the team had been making a series of samples to send to a company in Oxford called Exscientia, which has set up an AI platform designed to expedite the selection of candidates in drug discovery.

Drug-discovery potential

As of early March, 1500 crystals and fragments have been analysed. Owen attributes the team’s success so far to the incredible amounts of data they could collect and analyse quickly. With huge numbers of data sets, they could pin down the parameters of the viral protease with a high degree of confidence. And with the synchrotron light source they were able to create and analyse the diffraction patterns rapidly. The same amount of data collected with a lab-based X-ray source would have taken approximately 10 years. At Diamond, they were able to collect the data in a few days of accumulated beamtime.

A close up view of some residues in the active site of the protein, where the sticks represent the protein molecules and the mesh represents the electron density. Credit: D Owen/Diamond Light Source.

Synchrotron light sources all over the world have been granting priority and rapid access to researchers to support their efforts in discovering more about the virus. Researchers at the Advanced Photon Source in Argonne, US, and at Elettra Sincrotrone in Trieste, Italy, are also trying to identify molecules effective against COVID-19, in an attempt to bring us closer to an effective vaccine or treatment. This week, the ESRF in Grenoble, France, announced that it will make its cryo-electron microscope facility available for use. The community platform www.lightsources.org offers an overview of access routes and calls for proposals.

Synchrotron light sources all over the world have been granting priority and rapid access

In addition to allowing the structures of tens of thousands of biological molecules to be elucidated – such as that of the ribosome, which was recognised by the 2009 Nobel Prize in Chemistry – light sources have a strong pedigree in elucidating the structure of viruses. The development of common antiviral medications that block the action of viruses in the body, such as Tamiflu and Relenza, also relied upon synchrotrons to reveal the atomic structures of their viral targets.

By mapping the SARS-CoV-2 protease structures bound to small chemical fragments, the Diamond team has demonstrated a crystallography and fragment-screening tour de force. The resulting and ongoing work is a crucial first step in developing a drug. Forgoing the usual academic route of peer review, the Diamond team has made all of its results openly and freely available to help inform the public-health response and limit the spread of the virus, with the hope that this can fast-track effective treatment options.

Rolf Widerøe: a giant in the history of accelerators

The betatron is an early type of MeV-range electron accelerator that uses the electric field induced by a varying magnetic field to accelerate electrons, or beta particles. It operates like a transformer with the secondary winding replaced by a beam of electrons circulating in a vacuum tube. It was invented by the pioneering Norwegian accelerator physicist Rolf Widerøe while a student in 1925. When its construction failed at the time, he had to find another topic for his thesis, and so in 1927 he constructed the first linear accelerator (50 keV), before later proposing the principle of colliding beams to fully exploit the energy of accelerated particles. Through these innovations, Rolf Widerøe decisively influenced the course of high-energy physics, with betatrons shaping the landscape in the early days, and linear accelerators and colliding beams becoming indispensable tools today.

Obsessed by a Dream: The Physicist Rolf Widerøe – A Giant in the History of Accelerators, by Aashild Sørheim

Aashild Sørheim, a professional writer, now presents a new biography of this visionary engineer, who had a seminal impact on accelerator physics. Her book covers Widerøe’s whole life, from 1902 to 1996, and from his childhood in a well-to-do family in Oslo to his retirement in Switzerland. Certainly, many who read Pedro Waloscheck’s 1994 biography, The Infancy of Particle Accelerators: Life and Work of Rolf Widerøe, will be curious how this new book complements the former. Sørheim’s new offering is based on new documentary evidence, the result of painstaking sifting through archives, and a large number of interviews. She has opened new perspectives through her interviews, and the access she has gained in several countries to hitherto restricted archives has provided a wealth of new material and insights, in particular in relation to the second world war. Sørheim’s book focuses not on physics or technology, but on Widerøe himself, and the social and political environment in which he had to find his way. In particular, it gravitates to the question of his motivation to work in Germany in the troubled years from 1943 to 1945, when he constructed a betatron, the accelerator he had invented two decades earlier while a student in Karlsruhe.

Occupied Oslo

In the most interesting parts, the book provides background information about the entanglement of science, industrial interests and armament, and in particular the possible reasons for the “recruitment” of Rolf Widerøe in occupied Oslo in the spring of 1943 by three German physicists mandated by the German air force, who insinuated that willingness to cooperate might well help to improve the conditions of his brother Viggo, who was in prison in Germany for helping Norwegians escape to England. The apparent motivation was that a powerful betatron could produce X-rays strong enough to neutralise allied bomber pilots. Though leading German scientists quickly discovered this to be nonsense, the betatron project was not interrupted. The book describes the difficult working conditions in Hamburg, and the progress towards a 15 MeV betatron. Among the key players was Widerøe’s assistant Bruno Touschek, who was finally arrested by the Gestapo in 1945 because his mother was Jewish. It was during this time that Widerøe patented his idea to use colliding beams to maximise the available energy, against the advice of Touschek, who found the idea too trivial to publish. It was Touschek, though, who in 1961 first used this principle in AdA, the e+e− ring in Frascati that was the world’s first collider.

Widerøe faced official prosecution on the ludicrous charge of having helped develop V2 rockets

After Widerøe’s return to Oslo in March 1945, when the betatron was operational and the advancing English army made the study of a 200 MeV betatron illusory, he faced official prosecution on the ludicrous main charge of having helped develop V2 rockets, explains Sørheim. Released from prison after 47 days, he got away without trial, but had to pay a substantial fine. Unemployed, seeing no basis for pursuing his dream of further developing betatrons in his home country, and bearing the stigma of a collaborator in the understandably overheated atmosphere of the time, he moved his family to Switzerland in 1946. One chapter, strangely placed near the beginning of the book, describes how Widerøe then became a successful leader of betatron production at Brown-Boveri in Switzerland, a respected lecturer at the ETH in Zurich and a promoter of radiation therapy until late into his retirement. He was a CERN consultant in the early days, and worked with Odd Dahl and Frank Goward at Brookhaven in 1952, where they became acquainted with the alternating-gradient focusing principle, which was then boldly proposed to the CERN Council as the basis for the design of the 25 GeV Proton Synchrotron.

The book leaves the reader somewhat overwhelmed by the amount of material presented, the non-chronological presentation, and the many repetitions of the same facts, conveying the impression that the author had difficulty in putting the information in a coherent order. However, the many interviews and new documentary evidence, including a hitherto unknown letter from his brother Viggo, open novel perspectives on this extraordinary engineer and scientist who, besides receiving many honours abroad, finally also received recognition in his home country, after a lengthy reconciliation process.

A unique exercise in scientific diplomacy

The International Thermonuclear Experimental Reactor – now simply ITER – is a unique exercise in scientific diplomacy, and a politically driven project. It is also the largest international collaboration, and a milestone in the technological history of mankind. These, I would say, are the main conclusions of Michel Claessens’ new book ITER: The Giant Fusion Reactor. He unfolds a fascinating story that criss-crosses more than 40 years of the history of nuclear fusion in a simple, but not simplistic, way that is accessible to anyone with a will to stick to facts without prejudice. The full range of opinions on ITER’s controversial benefits and detriments is presented and discussed fairly, and the author never hides his personal connection to the project as its head of communications for many years.

ITER Claessens cover

Why don’t we more resolutely pursue a technology that could contribute to the production of carbon-free energy? ITER’s path has been plagued by rivalries between strong personalities, and by difficult technical and political decisions, though, in retrospect, few domains of science and technology have received such strong and continuous support from governments and agencies. Claessens’ book begins by discussing the need for fusion among other energy sources – he avoids selling fusion as the “unique and final” solution to energy problems – and quickly brings us to the heart of a key problem humanity is facing today. Travelling through history, the author shows that when politicians take decisions of high inspiration, as at the famous fireside summit between presidents Reagan and Gorbachev in Geneva in November 1985, where the idea for a collaborative project to develop fusion energy for peaceful purposes was born, they change the course of history – for the better! The book then goes through the difficulties of setting up a complex project animated by a political agenda (fusion had been on the agenda of US–USSR political summits since the cold war) without a large laboratory backing it up.

The author shows that when politicians take decisions of high inspiration they change the course of history

Progress with ITER was made more difficult by a complex system of in-kind contributions, optimised not for cost or technical success but for political “return” to each of ITER’s members (Europe, China, Japan, Russia, South Korea, the US and, most recently, India). Claessens’ examples are striking, and he doesn’t skirt around the inevitable hot questions: what is the real cost of ITER? Will it even be finished, given its multiple delays? How much of the extra cost and delay is due to the complex and politically oriented governance structures established by the partners? The answers are clear, honestly reported and quantitative, though the author makes it clear that the numbers should be taken cum grano salis. Assessing the cost of a project in which 90% of the components are in-kind contributions, with each partner having its own accounting structures and, in certain cases, no desire to reveal the real cost, is a dubious enterprise. However, we can say with some certainty that ITER is taking twice as long and likely costing more than double what was initially planned, and, as the author says on more than one occasion, further delays will likely entail additional costs. By comparison, the LHC needed roughly an additional 25% in both budget and time compared to what was initially planned.

Price tag

Was the initial cost estimate for ITER simply too low, perhaps to help the project get approved, or would different management, with a different governance structure, have performed better? Significantly, I have not met a single knowledgeable person who did not strongly assert that ITER is a textbook case of bad management organisation, though in my opinion the book does not do justice to the energetic action of the current director-general, Bernard Bigot. His directorate has been a turning point in ITER’s construction, and has set the project back on track at a moment of real crisis, when many scientists and managers expected the project to fail. A key question surfaces in the book: is the price tag important? ITER’s cost is peanuts compared to the EU’s budget, for example, and it is not significant by comparison to the promise the project delivers: carbon-free energy in large quantities, at an affordable cost to the environment, and based on widely distributed fuel.

Michel Claessens’ book explores different points of view without fanaticism

Though there is almost no intrinsic innovation in ITER, Claessens shows how the project has nevertheless pushed tokamak technology beyond its apparent limits by a sheer increase in size, though he neglects some key points, such as the incredible stored energy of the superconducting magnets. An incident similar to that suffered by the LHC in 2008 would be a logistical nightmare for ITER, as it contains more than three times the stored energy of the entire LHC and its detectors in an incomparably smaller volume. Comparisons with CERN are however a feature throughout the book, and a point of pride for high-energy physicists — clearly, CERN has set the standard for high-tech international collaboration, and ITER has tried to follow its example (CERN Courier October 2014 p45). Having begun my career as a plasma scientist, before turning to accelerators at the beginning of the 1980s, I know some of the stories and personalities involved, including CERN’s former Director General, and recognised father of ITER, Robert Aymar, and ITER’s head of superconductor procurement, my close friend Arnaud Devred, also now of CERN.

I recommend Michel Claessens’ well written and easy-to-read book. It is passionate and informative and explores different points of view without fanaticism. Interestingly, his conclusion is not scientific or political, but socio-philosophical in nature: ITER will be built because it can be, he says, according to a principle of “technological necessity”.

HL-LHC superconducting quadrupole successfully tested

The quadrupole magnet being prepared for a test at Brookhaven National Laboratory. Credit: Brookhaven National Laboratory

A quadrupole magnet for the high-luminosity LHC (HL-LHC) has been tested successfully in the US, attaining a conductor peak field of 11.4 T – a record for a focusing magnet ready for installation in an accelerator. The 4.2 m-long, 150-mm-single-aperture device is based on the superconductor niobium tin (Nb3Sn) and is one of several quadrupoles being built by US labs and CERN for the HL-LHC, where they will squeeze the proton beams more tightly within the ATLAS and CMS experiments to produce a higher luminosity. The result follows successful tests carried out last year at CERN of the first accelerator-ready Nb3Sn dipole magnet, and both of these milestones are soon to be followed by tests of other 7.2 m and 4.2 m quadrupole magnets at CERN and in the US.

“This copious harvest comes after significant recent R&D on niobium-tin superconducting magnet technology and is the best answer to the question of whether the HL-LHC is on time: it is,” says HL-LHC project leader Lucio Rossi of CERN. “We should also underline that this full-length, accelerator-ready magnet performance record is a real textbook case for international collaboration in the accelerator domain: since the very beginning the three US labs and CERN teamed up and managed to run a common and highly synergistic R&D programme, particularly for the quadrupole magnet that is the cornerstone of the upgrade. This has resulted in substantial savings and improved output.”

This is a real textbook case for international collaboration in the accelerator domain

Lucio Rossi

The current LHC magnets, which have been tested to a bore field of 8.3 T and are currently operated at 7.7 T at 1.9 K for 6.5 TeV operation, are made from the superconductor niobium-titanium (Nb-Ti). As the transport properties of Nb-Ti are limited for fields beyond 10-11 T at 1.9 K, HL-LHC magnets call for a move to Nb3Sn, which remains superconducting at much higher fields. Although Nb3Sn has been studied for decades and is already in widespread use in solenoids for NMR (not to mention underpinning the large coils, presently being manufactured, that will contain and control the plasma in the ITER fusion experiment), it is more challenging than Nb-Ti to work with: once formed, the Nb3Sn compound becomes brittle and strain-sensitive, and is therefore much harder than niobium-titanium alloy to process into cables that can be wound with the accuracy required to achieve the performance and field quality of state-of-the-art accelerator magnets.

Researchers at Fermilab, Brookhaven National Laboratory and Lawrence Berkeley National Laboratory are to provide a total of 16 quadrupole magnets for the interaction regions of the HL-LHC, which is due to operate from 2027. The purpose of a quadrupole magnet is to produce a field gradient in the radial direction with respect to the beam, allowing charged-particle beams to be focused. A test was carried out at Brookhaven in January, when the team operated the 8-tonne quadrupole magnet continuously at a nominal field gradient of around 130 T/m and a temperature of 1.9 K for five hours. Eight longer quadrupole magnets (each providing a cold mass equivalent to two US quadrupole magnets) are being produced by CERN.
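As a back-of-envelope illustration (not a calculation from the article), the field of an ideal quadrupole grows linearly with distance from the axis, B(r) = G·r, so the nominal gradient and aperture quoted above fix the approximate field at the edge of the bore:

```python
# Ideal-quadrupole estimate: B(r) = G * r, using the nominal gradient and
# aperture quoted in the article. The linear-field model is an illustration
# only, not the magnet group's own analysis.
gradient = 130.0           # T/m, nominal HL-LHC quadrupole gradient
aperture_diameter = 0.150  # m, single-aperture bore

bore_radius = aperture_diameter / 2
field_at_bore_edge = gradient * bore_radius  # T

print(f"B at edge of bore ≈ {field_at_bore_edge:.2f} T")  # ≈ 9.75 T
```

The conductor peak field of 11.4 T reported above is higher than this bore-edge value, as expected: the windings sit outside the bore and see a locally enhanced field.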

It’s a very cutting-edge magnet

Kathleen Amm

“We’ve demonstrated that this first quadrupole magnet behaves successfully and according to design, based on the multiyear development effort made possible by DOE investments in this new technology,” said Fermilab’s Giorgio Apollinari, head of the US Accelerator Upgrade Project in a Fermilab press release. “It’s a very cutting-edge magnet,” added Kathleen Amm, who is Brookhaven’s representative for the project.

Dipole tests at CERN

In addition to stronger focusing magnets, the HL-LHC requires new dipole magnets positioned on either side of a collimator that intercepts off-momentum protons in the high-intensity beam. To gain the required space in the magnetic lattice, Nb3Sn dipole magnets of shorter length and higher field than the current LHC dipole magnets are needed. In July 2019 the CERN magnet group successfully tested a full-length, 5.3-m, 60-mm-twin-aperture dipole magnet – the longest Nb3Sn magnet tested so far – and achieved a nominal bore field of 11.2 T at 1.9 K (corresponding to a conductor peak field of 11.8 T).

“This multi-year effort on Nb3Sn, which we are running together with the US, and our partner laboratories in Europe, is leading to a major breakthrough in accelerator magnet technology, from which CERN, and the whole particle physics community, will profit for the years to come,” says Luca Bottura, head of the CERN magnet group.

The dipole- and quadrupole-magnet milestones also send a positive signal about the viability of future hadron colliders beyond the LHC, which are expected to rely on Nb3Sn magnets with fields of up to 16 T. To this end, CERN and the US labs are achieving impressive results in the performance of Nb3Sn conductor in various demonstrator magnets. In February, the CERN magnet group produced a record field of 16.36 T at 1.9 K (16.5 T conductor peak field) in the centre of a short “enhanced racetrack model coil” demonstrator, with no useful aperture, which was developed in the framework of the Future Circular Collider study. In June 2019, as part of the US Magnet Development Programme, a short “cos-theta” dipole magnet with an aperture of 60 mm reached a bore field of 14.1 T at 4.5 K at Fermilab. Beyond magnets, says Rossi, the HL-LHC is also breaking new ground in superconducting-RF crab cavities, advanced material collimators and 120 kA links based on novel MgB2 superconductors.

Next steps

Before they can constitute fully operational accelerator magnets ready for installation in the HL-LHC, both these quadrupole magnets and the dipole magnets must be connected in pairs (the longer CERN quadrupole magnets are single units). Each magnet in a pair has the same winding, and differs only in its mechanical interfaces and details of its electrical circuitry. Tests of the remaining halves of the quadrupole- and dipole-magnet pairs were scheduled to take place in the US and at CERN during the coming months, with the dipole-magnet pairs to be installed in the LHC tunnel this year. Given the current global situation, this plan will have to be reviewed, which is now a high-priority discussion within the HL-LHC project.

Plasma polarised by spin-orbit effect

Figure 1

Spin-orbit coupling causes fine structure in atomic physics and shell structure in nuclear physics, and is a key ingredient in the field of spintronics in materials sciences. It is also expected to affect the development of the quickly rotating quark–gluon plasma (QGP) created in non-central collisions of lead nuclei at LHC energies. As such plasmas are created by the collisions of lead nuclei that almost miss each other, they have very high angular momenta of the order of 107ħ – equivalent to the order of 1021 revolutions per second. While the extreme magnetic fields generated by spectating nucleons (of the order of 1014 T, CERN Courier Jan/Feb 2020 p17) quickly decay as the spectator nucleons pass by, the plasma’s angular momentum is sustained throughout the evolution of the system as it is a conserved quantity. These extreme angular momenta are expected to lead to spin-orbit interactions that polarise the quarks in the plasma along the direction of the angular momentum of the plasma’s rotation. This should in turn cause the spins of vector (spin-1) mesons to align if hadronisation proceeds via the recombination of partons or by fragmentation. To study this effect, the ALICE collaboration recently measured the spin alignment of the decay products of neutral K* and φ vector mesons produced in non-central Pb–Pb collisions.

Spin alignment can be studied by measuring the angular distribution of the decay products of the vector mesons. It is quantified by the probability ρ00 of finding a vector meson in the spin state 0 with respect to the direction of the angular momentum of the rotating QGP, which is approximately perpendicular to the plane defined by the beam direction and the impact parameter of the two colliding nuclei. In the absence of spin-alignment effects, the probability of finding a vector meson in any of the three spin states (–1, 0, 1) should be equal, with ρ00 = 1/3.
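Concretely, such analyses typically fit the decay-product angular distribution with the standard parameterisation dN/d(cos θ*) = (3/4)[(1 − ρ00) + (3ρ00 − 1)cos²θ*], quoted here from the general spin-alignment literature rather than from this article. A minimal sketch of its behaviour:

```python
import numpy as np

def angular_distribution(cos_theta, rho00):
    """Normalised vector-meson decay angular distribution,
    dN/d(cos θ*) = (3/4) [(1 − ρ00) + (3 ρ00 − 1) cos²θ*],
    with θ* measured from the quantisation axis."""
    return 0.75 * ((1 - rho00) + (3 * rho00 - 1) * cos_theta**2)

cos_theta = np.linspace(-1, 1, 201)

# ρ00 = 1/3: no spin alignment, the distribution is flat (isotropic decay).
flat = angular_distribution(cos_theta, 1/3)
print(np.allclose(flat, 0.5))  # True

# ρ00 < 1/3 (the direction of the K* deviation): fewer decay products
# emitted along the quantisation axis, more perpendicular to it.
aligned = angular_distribution(cos_theta, 0.2)
print(aligned.max() > 0.5 > aligned.min())  # True
```

Any deviation of the fitted ρ00 from 1/3 therefore shows up directly as a curvature of the measured cos θ* distribution.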

The ALICE collaboration measured the angular distributions of neutral K* and φ vector mesons via their hadronic decays to Kπ and KK pairs, respectively. ρ00 was found to deviate from 1/3 for low-pT and mid-central collisions at a level of 3σ (figure 1). The corresponding results for φ mesons show a deviation of ρ00 values from 1/3 at a level of 2σ. The observed pT dependence of ρ00 is expected if quark polarisation via spin-orbit coupling is subsequently transferred to the vector mesons by hadronisation, via the recombination of a quark and an anti-quark from the quark–gluon plasma. The data are also consistent with the initial angular momentum of the hot and dense matter being highest for mid-central collisions and decreasing towards zero for central and peripheral collisions.

The results are surprising as studies with Λ hyperons are compatible with zero

The results are surprising, however, as corresponding quark-polarisation values obtained from studies with Λ hyperons are compatible with zero. A number of systematic tests have been carried out to verify these surprising results. K0S mesons do indeed yield ρ00 = 1/3, indicating no spin alignment, as must be true for a spin-zero particle. For proton–proton collisions, the absence of initial angular momentum also leads to ρ00 = 1/3, consistent with the observed neutral K* spin alignment being the result of spin-orbit coupling.

The present measurements are a step towards experimentally establishing possible spin-orbit interactions in the relativistic-QCD matter of the quark–gluon plasma. In the future, higher-statistics measurements in Run 3 will significantly improve the precision, and studies with the charged K*, which has a magnetic moment seven times larger than that of the neutral K*, may even allow a direct observation of the effect of the strong magnetic fields initially experienced by the quark–gluon plasma.

Einstein and Heisenberg: The Controversy over Quantum Physics

Einstein and Heisenberg: The Controversy over Quantum Physics

This attractive and exciting book gives easy access to the history of the two main pillars of modern physics of the first half of the 20th century: the theory of relativity and quantum mechanics. The history unfolds along the parallel biographies of the two giants in these fields, Albert Einstein and Werner Heisenberg. It is a fascinating read for everybody interested in the science and culture of their time.

At first sight, one could think that the author presents a twin biography of Einstein and Heisenberg, and that’s all. However, one quickly realises that there is much more to this concise and richly illustrated text. Einstein and Heisenberg’s lives are embedded in the context of their time, with emphasis given to explaining the importance and nature of their interactions with the physicists of rank and name around them. For both men, the author cites many examples from letters and documents within their respective environments, which are most interesting to read and illustrate well the spirit of the time. Direct interactions between the book’s two heroes were, however, quite sparse.

At several stages throughout the book, the reader becomes familiar with the personal life stories of both protagonists, who were, in spite of some commonalities, very different from each other. Common to both, for instance, were their devotion to music and their early interest and outstanding talent in physics as schoolboys in Munich; by contrast, they were very different in their relations with family and partners, as the author discusses in a lively way. Many of these aspects are well known, but new facets are presented too. I liked the way this is done; in particular, the author does not shy away from documenting the perhaps less commendable human aspects, but he does so without judgement, leaving readers to come to their own conclusions.

A broad spectrum of topics is commented on in a special chapter called “Social Affinities”. These include religion, music, the importance of family and, in the case of Einstein, his relation to his wives and to women in general, the way he dealt with his immense public reputation as a super-scientist, and his later years, when he could be seen as “scientifically an outsider”. In Heisenberg’s case, one is reminded of his major contributions to the restoration of scientific research in West Germany and Europe after World War II, not least his crucial founding role in the establishment of CERN.

Do not expect a systematic, comprehensive introduction to relativity and quantum physics; this is not a textbook. Its great value is the captivating way the author illustrates how these great minds formed their respective theories in relation to the physics and academic world of their time. The reader learns not only about Einstein and Heisenberg, but also about many of their contemporary colleagues. A central part of this is the controversy over the interpretation of quantum mechanics among Heisenberg’s colleagues and mentors, such as Schrödinger, Bohr, Pauli, Born and Dirac, to name just a few.

Another aspect of overriding importance for the history of that time was of course the political environment spanning the time from before World War I to after World War II. Both life trajectories were influenced in a major way by these external political and societal factors. The author gives an impressive account of all these aspects, and sheds light on how the pair dealt with these terrible constraints, including their attitudes and roles in the development of nuclear weapons.

A special feature of the book, which will make it interesting to everybody, is the inclusion of various hints as to where relativity and quantum mechanics play a direct role in our daily lives today, as well as in topical contemporary research, such as the recently opened field of gravitational-wave astronomy.

This is an ambitious book, which tells the story of the birth of modern physics in a well-documented and well-illustrated way. The author has managed brilliantly to do this in a serious, but nevertheless entertaining, way, which will make the book a pleasant read for all.

Protons herald new cardiac treatment

The 80 m-circumference synchrotron at CNAO

In a clinical world-first, a proton beam has been used to treat a patient with ventricular tachycardia, a condition in which unsynchronised electrical impulses prevent the heart from pumping blood effectively. On 13 December, a 150 MeV beam of protons was directed at a portion of tissue in the heart of a 73-year-old male patient at the National Center of Oncological Hadrontherapy (CNAO) in Italy – a facility conceived 25 years ago by the TERA Foundation and rooted in accelerator technologies developed in conjunction with CERN via the Proton Ion Medical Machine Study (PIMMS). The successful procedure had a minimal impact on the delicate surrounding tissues, and marks a new path in the rapidly evolving field of hadron therapy.

The use of proton beams in radiation oncology, first proposed in 1946 by Robert Wilson, founding director of Fermilab, allows a large dose to be deposited in a small and well-targeted volume, reducing damage to the healthy tissue surrounding a tumour and thereby reducing side effects. Upwards of 170,000 cancer patients have benefitted from proton therapy at almost 100 centres worldwide, and demand continues to grow (CERN Courier January/February 2018 p32).

The choice by clinicians in Italy to use protons to treat a cardiac pathology was born of the necessity to fight an aggressive form of ventricular tachycardia that had not responded effectively to traditional treatments. The idea is that the Bragg peak typical of light charged ions (by which a beam can deposit a large amount of energy in a small region) can produce small scars in the heart tissue similar to those caused by the standard invasive technique of RF cardiac ablation. “To date, the use of heavy particles (protons, carbon ions) in this area has been documented in the international scientific literature only on animal models,” said Roberto Rordorf, head of arrhythmology at San Matteo Hospital, in a press release on 22 January. “The Pavia procedure appears to be the first in the world to be performed on humans and the first results are truly encouraging. For this reason, together with CNAO we are evaluating the feasibility of an experimental clinical study.”
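To give a feel for why 150 MeV is the right scale for reaching tissue deep in the chest, the empirical Bragg-Kleeman rule R ≈ αE^p relates proton energy to range in water. The parameters below are commonly quoted textbook values for water, not numbers from the article or from CNAO’s treatment planning:

```python
# Bragg-Kleeman rule R = alpha * E**p for protons in water.
# alpha and p are commonly quoted empirical values (assumptions, not from
# the article); clinical planning uses far more detailed dose models.
ALPHA = 0.0022  # cm / MeV**p, water
P = 1.77        # dimensionless exponent, water

def proton_range_water_cm(energy_mev):
    """Approximate proton range in water, in cm."""
    return ALPHA * energy_mev ** P

# A 150 MeV beam, as used in the CNAO treatment, stops at a depth of:
print(f"{proton_range_water_cm(150):.1f} cm")  # roughly 15-16 cm
```

The sharp dose peak near the end of that range is what allows the beam to scar a small region of cardiac tissue while sparing what lies in front of and behind it.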

Hadron therapy for all

CNAO is one of just six next-generation particle-therapy centres in the world capable of generating beams of protons and carbon ions, which are biologically more effective than protons in the treatment of radioresistant tumours. The PIMMS programme from which the accelerator design emerged, carried out at CERN from 1996 to 2000, aimed to design a synchrotron optimised for ion therapy (CERN Courier January/February 2018 p25). The first dual-ion treatment centre in Europe was the Heidelberg Ion-Beam Therapy Centre (HIT) in Germany, designed by GSI, which treated its first patient in 2009. CNAO followed in 2011 and then the Marburg Ion-Beam Therapy Centre in Germany (built by Siemens and operated by Heidelberg University Hospital since 2015). Finally, MedAustron in Austria, based on the PIMMS design, has been operational since 2016. Last year, CERN launched the Next Ion Medical Machine Study (NIMMS) as a continuation of PIMMS to carry out R&D into the superconducting magnets, linacs and gantries for advanced hadron therapy. NIMMS will also explore ways to reduce the cost and footprint of hadron therapy centres, allowing more people in different regions to benefit from the treatment (CERN Courier March 2017 p31).

I think that in 20 years’ time cardiac arrhythmias will be mostly treated with light-ion accelerators

“When I decided to leave the spokesmanship of the DELPHI collaboration to devote my time to cancer therapy with light-ion beams I could not imagine that, 30 years later, I would have witnessed the treatment of a ventricular tachycardia with a proton beam and, moreover, that this event would have taken place at CNAO, a facility that has its roots at CERN,” says TERA founder Ugo Amaldi. “The proton treatment recently announced, proposed to CNAO by cardiologists of the close-by San Matteo Hospital to save the life of a seriously ill patient, is a turning point. Since light-ion ablation is non-invasive and less expensive than the standard catheter ablation, I think that in 20 years’ time cardiac arrhythmias will be mostly treated with light-ion accelerators. For this reason, TERA has secured a patent on the use of ion linacs for heart treatments.”

LHC and RHIC heavy ions dovetail in Wuhan

The 28th International Conference on Ultrarelativistic Nucleus-Nucleus Collisions, also known as “Quark Matter”, took place in Wuhan, China, in November. More than 800 participants discussed the latest results of the heavy-ion programmes at the Large Hadron Collider and at Brookhaven’s Relativistic Heavy-Ion Collider (RHIC), as well as the most recent theoretical developments. The focus of these studies is the fundamental understanding of strongly interacting matter at extremes of temperature and density. In these conditions, which also characterise the early universe, matter is a quark-gluon plasma (QGP), in which quarks and gluons are not confined within hadrons. In the recent editions of Quark Matter, much attention has also been devoted to the study of emergent QCD phenomena in high-multiplicity proton-proton and proton-nucleus collisions, which resemble the collective effects seen in nucleus-nucleus collisions and pose the intriguing question of whether a QGP can also form in “small-system” collisions.

The LHC and RHIC together cover a broad range of quark-gluon-plasma temperatures

The large data sample from the Pb-Pb period of LHC Run 2 in 2018 allowed ALICE, ATLAS, CMS and LHCb to study rare probes of the QGP, such as jets and heavy quarks, with unprecedented precision. New constraints on the energy loss of partons when traversing the high-density medium were presented, pushing the limits of jet measurements to lower transverse momenta and larger radii: jet modifications are now measured in the transverse momentum range from 40 to 1000 GeV/c and in the jet radius (resolution parameter) range 0.2 to 1. The internal structure of jets was studied not only by the LHC experiments, but also by the PHENIX and STAR collaborations at the 25-times lower RHIC collision energy. LHC and RHIC measurements are complementary as they cover a broad range of QGP temperatures and differ in the balance of quark- and gluon-initiated jets, with the former dominating at RHIC and the latter dominating at the LHC.  

New probes

Measurements in the sectors of heavy quarks and rarely produced light nuclei (such as deuterons, 3He and the hypertriton, a pnΛ bound state) also strongly benefitted from the large samples recently recorded at the LHC. In particular, their degree of collective behaviour could be studied in much greater detail. The family of QGP probes in the heavy-quark sector has been extended with new members at the LHC by the first observations of the X(3872) exotic hadron and of top-antitop quark production. In the sector of electromagnetic processes, new experimental observations were presented for the first time at the conference, including the photo-production of dileptons in collisions with and without hadronic overlap, and light-by-light scattering. These effects are induced by the interaction of the strong electromagnetic fields of the two Pb nuclei (Z = 82) passing close to each other (CERN Courier January/February 2020 p17).

In nuclear collisions the fluid-dynamical flow of the QGP leaves an imprint on the azimuthal distribution of soft particles, as the initial geometry of the collision is translated into flow through pressure gradients. Its experimental trace is multi-particle angular correlations between low-momentum particles, even at large rapidity separations. In non-central nucleus-nucleus collisions, which have an elliptical initial geometry, the resulting azimuthal modulation of the particle momentum distribution is called elliptic flow. New information on collective behaviour and on the dynamics of heavy-quark interactions in the QGP was added by a first measurement of the D-meson momentum distribution down to zero momentum in Pb-Pb collisions at the LHC, and by new measurements of the elliptic flow of D mesons, of muons from charm and beauty decays, and of bound states of heavy quarks (charmonia and bottomonia). These measurements suggest a stronger degree of collective behaviour for light quarks than for heavy quarks, and further constrain estimates of the QGP viscosity. Such estimates also require an understanding of heavy-quark hadronisation, which was discussed in the light of new results at RHIC and the LHC indicating an increased production of charmed baryons with respect to mesons at low momentum, in both pp and nucleus-nucleus collisions, when compared to expectations from electron-positron collisions.

The situation is much less clear in the collisions of small systems

While there is strong evidence for the production of the QGP in nuclear collisions, the situation is much less clear in collisions of small systems. The momentum correlations and azimuthal modulation that characterise the large nuclear collisions were also observed in smaller collision systems, such as p-Pb at the LHC, p-Au, d-Au and 3He-Au at RHIC, and even pp. The persistence of these correlations in smaller collision systems, down to pp collisions, where it is unlikely that an equilibrated system could be created, may offer an inroad to understanding how the collective behaviour of the QGP arises from the microscopic interactions of its individual constituents. New measurements of multi-particle correlations were presented and the dynamical origin of the collectivity in small systems was discussed. Small expanding QGP droplets, colour connections of overlapping QCD strings, and final-state rescattering at the partonic or hadronic level are among the mechanisms proposed to describe these observations. While many signs characteristic of the QGP are seen in small-system collisions, parton energy loss (in the form of jet or large-momentum-hadron modifications) remains absent in the measurements carried out to date.

The future

Beyond Quark Matter 2019, the field is now looking forward to the future programmes at the LHC and at RHIC, which were extensively reviewed at the conference. At the LHC, the heavy-ion injectors and the experiments are currently being upgraded. In particular, the heavy-ion-dedicated ALICE detector is undergoing major improvements, with readout and tracker upgrades that will provide larger samples and better performance for heavy-flavour selection. Run 3 of the LHC, which is scheduled to start in 2021, will provide integrated luminosity increases ranging from one order of magnitude for the data samples based on rare triggers to two orders of magnitude for the minimum-bias (non-triggered) samples. At RHIC, the second beam-energy-scan programme is now providing the STAR experiment with higher precision data to search for the energy evolution of QGP effects, and the new sPHENIX experiment aims at improved measurements of jets and heavy quarks from 2023. Low-energy programmes at the CERN SPS, NICA, FAIR, HIAF and J-PARC, which target a systematic exploration of heavy-ion collisions with high baryon density to search for the onset of deconfinement and the predicted QCD critical point, were also discussed in Wuhan, and the updated plans for the US-based Electron-Ion Collider (EIC), which is foreseen to be constructed at Brookhaven National Laboratory, were presented. With ep and e-nucleus interactions, the EIC will provide unprecedented insights into the structure of the proton and the modification of parton densities in nuclei, which will benefit our understanding of the initial conditions for nucleus-nucleus collisions. 
