
Third Thoughts

By Steven Weinberg
The Belknap Press of Harvard University Press


When Nobel laureates offer their point of view, people generally are curious to listen. Self-described rationalist, realist, reductionist and devoutly secular, Steven Weinberg has published a new book reflecting on current affairs in science and beyond. In Third Thoughts, he addresses themes that are of interest for both laypeople and researchers, such as the public funding of science.

Weinberg shared the Nobel Prize in Physics in 1979 for unifying the weak interaction and electromagnetism into the electroweak theory, the core of the Standard Model, and has made many other significant contributions to physics. At the same time, Weinberg has been and remains a keen science populariser. Probably his most famous work is the popular-science book The First Three Minutes, where he recounts the evolution of the universe immediately following the Big Bang.

Third Thoughts is his third collection of essays for non-specialist readers, following Facing Up (2001) and Lake Views (2009). In it are 25 essays divided into four themes: science history, physics and cosmology, public matters, and personal matters. Some are the texts of speeches, some were published previously in The New York Review of Books, and others are released for the first time.

The essays span subjects from quantum mechanics to climate change, from broken symmetry to cemeteries in Texas, and are pleasantly interspersed with his personal life stories. Like his previous collections, Weinberg deals with topics that are dear to him: the history of science, science spending, and the big questions about the future of science and humanity.

The author defines himself as an enthusiastic amateur in the history of science, albeit a “Whig interpreter” (meaning that he evaluates past scientific discoveries by comparing them to the current advancements – a method that irks some historians). Beyond that, his taste for controversy encourages him to cogitate over Einstein’s lapses, Hawking’s views, the weaknesses of quantum mechanics and the US government’s financing choices, among others.

Readers who are interested in US politics will find the section “Public matters” very thought-provoking. In particular, the essay “The crisis of big science” is based on a talk he gave at the World Science Festival in 2011 and later published in The New York Review of Books. He explains the need for big scientific projects, and describes how both cosmology and particle physics are struggling for governmental support. Though still disappointed by the cancellation of the Superconducting Super Collider (SSC) in the early 1990s, he is excited by the new endeavours at CERN. He reiterates his frank opposition to manned space flight, and emphasises how some scientific obstacles are intertwined with the wider historical context. In this way, Weinberg sets the demise of the SSC in a broader, troubled landscape in which education, healthcare, transportation and law enforcement are also under threat.

The author condenses the essence of what physicists have learnt so far about the laws of nature and why science is important. This is a book about asking the right questions when the time is ripe to look for the answers. He explains that the question “What is the world made of?” had to wait for advances in chemistry at the end of the 18th century, and “What is the structure of the electron?” had to wait for quantum mechanics, while “What is an elementary particle?” is still awaiting an answer.

The essays vary in difficulty, and some concepts and views recur across several of them, so each can be read independently. While most are digestible for readers without any background in particle physics, a general understanding of the Standard Model helps in grasping some passages. Having said that, the general reader can still follow the big picture and the logically argued thoughts.

Several essays talk about CERN. More specifically, the essay “The Higgs, and beyond” was written in 2011, before the announcement of the Higgs boson discovery, and briefly presents the possibility of technicolour forces. The following essay, “Why the Higgs?”, was commissioned just after the announcement in 2012 to explain “what all the fuss is about”.

One of the most curious essays to explore is number 24. Citing Weinberg: “Essay 24 has not been published until now because everyone who read it disagreed with it, but I am fond of it so bring it out here.” There, he draws parallels between his job as a theoretical physicist and that of creative artists.

Not all scientists are able to write in such an unconstrained and accessible way. Despair, sorrow, frustration, doubt, uneasiness and wishes all emerge page after page, offering the reader the privilege of coming closer to one of the sharpest scientific minds of our era.

From Stars to States: A Manifest for Science in Society

By Thierry Courvoisier
Springer

From Stars to States

This book is a curiosity, but like many curiosities, well worth stumbling across. It is the product of a curious, roving mind with a long and illustrious career dedicated to the exploration of nature and the betterment of society. Pieced together with cool scientific logic, it takes the reader from a whistle-stop tour of modern astronomy through the poetry collection of Jocelyn Bell Burnell, to a science-inspired manifesto for the future of our planet. After an opening chapter tracing the development of astronomy from the 1950s to now, subsequent chapters show how gazing at the stars, and learning from doing so, has brought benefit to people from antiquity to modern times across a wide range of disciplines.

Astronomy helped our ancestors to master time, plant crops at the right moment, and navigate their way across wide oceans. There’s humour in the form of speculation about the powers of persuasion of those who convinced the authorities of the day to build the great stone circles that dot the ancient world, allowing people to take time down from the heavens. These were perhaps the Large Hadron Colliders of their time, and, in Courvoisier’s view, probably took up a considerably larger fraction of ancient GDP (gross domestic product) than modern scientific instruments. John Harrison’s remarkable clocks are given pride of place in the author’s discussion of time, though the perhaps even more remarkable Antikythera mechanism is strangely absent.

By the time we reach chapter three, the beginnings of a virtuous circle linking basic science to technology and society are beginning to appear, and we can start to guess where Courvoisier is taking us. The author is not only an emeritus professor of astronomy at the University of Geneva, but also a former president of the Swiss Academy of Sciences and current president of EASAC, the European Academies Science Advisory Council. For good measure, he is also president of the H Dudley Wright Foundation, a charitable organisation that supports science communication activities, mainly in French-speaking Switzerland. He is, in short, a living, breathing link between science and society.

In chapter four, we enjoy the cultural benefits of science and the pleasure of knowledge for its own sake. We have a glimpse of what in Swiss German is delightfully referred to as Aha Erlebnis – that eureka moment when ideas just fall into place. It reminded me of a passage in another curious book, Kary Mullis’s Dancing Naked in the Mind Field, in which Mullis describes the Aha Erlebnis that led to his Nobel Prize in Chemistry in 1993. It apparently came to him so strongly out of the blue on a night drive along a California freeway that he had to pull off the road and write it down. Edison’s famous 1% inspiration may be rare, but what a wonderful thing it is when it happens.

Chapter five begins the call to action for scientists to take up the role that their field demands of them in society. “We still need to generate the culture required to […] bring existing knowledge to places where it can and must contribute to actions fashioning the world.” Courvoisier examines the gulf between the rational world of science and the rather different world of policy – a gulf once memorably described by Lew Kowarski in his account of the alliance between scientists and diplomats that led to the creation of CERN. “It was a pleasure to watch the diplomats grapple with the difference between a cyclotron and a plutonium atom,” he said. “We had to compensate by learning how to tell a subcommittee from a working party, and how – in the heat of a discussion – to address people by their titles rather than their names. Each side began to understand the other’s problems and techniques; a mutual respect grew in place of the traditional mistrust between egg-headed pedants and pettifogging hair-splitters.” CERN stands as evidence of the good that comes when science and policy work together.

As we reach the business end of the book, we find a rallying call for strengthening our global institutions, and here another of Courvoisier’s influences comes to the fore. He’s Swiss, and a scientist. Scientists have long understood the benefits of collaboration, and if there is one country in the world that has managed to reconcile the nationalism of its regions with the greater need of the supra-cantonal entity of the country as a whole, it is Switzerland. It would be a gross oversimplification to say that Courvoisier’s manifesto is to apply the Swiss model to global governance, but you get the idea.

The book was originally published in French by the Geneva publisher Georg, and if there’s one criticism I have of it, it’s the translation. It made Catherine Bréchignac, who writes with great fluidity in French, come across as rather clunky in her introduction, and on more than one occasion I found myself wondering whether the words I was reading really expressed what the author wanted to say. Springer and the Swiss Academy of Sciences are to be lauded for bringing this manifesto to an Anglophone audience, but for those who read French, I’d recommend the original.

Particle interactions up to the highest energies

The 20th International Symposium on Very High Energy Cosmic Ray Interactions (ISVHECRI 2018) was held in Nagoya, Japan, on 21–25 May. More than 120 attendees from 19 countries discussed various aspects of hadronic interactions at the intersection between high-energy cosmic-ray physics and classical accelerator-based particle physics. The 65 contributions reflected the large diversity and interdisciplinary character of this biennial series, which is held under the auspices of the International Union of Pure and Applied Physics.

In his opening address, Sunil Gupta paid tribute to Oscar Saavedra, one of the leading scientists and founders of the ISVHECRI series, who passed away in 2018. Following the long tradition of this symposium series, the main topic was the discussion of particle physics of relevance to extensive air showers, secondary cosmic-ray production, and hadronic multi-particle production at accelerators. This time, the symposium expanded its coverage of multi-messenger astrophysics, especially neutrino and gamma-ray astrophysics. Many talks were invited from the Pierre Auger Observatory and Telescope Array, as well as from IceCube, Super-Kamiokande, CTA and HAWC, and space-borne experiments such as AMS-02, Fermi and CALET.

Participants discussed how many open questions in high-energy astroparticle physics are related to our understanding of cosmic-ray interactions from the multi-messenger point of view; for example, the relevance of production and propagation of positrons or antimatter for indirect dark-matter searches, or of atmospheric-neutrino production for neutrino oscillations or neutrino astronomy.

Showcasing several models of high-energy cosmic-ray interactions, and their verification by accelerator measurements, was also a highlight of the symposium. The event offered a unique opportunity for developers of major cosmic-ray interaction models to gather and engage in valuable discussions. Other highlights were the talks about accelerator data relevant to cosmic-ray observations, reported by the teams behind CERN’s large LHC experiments as well as smaller fixed-target experiments such as NA61. Emphasis was put on forward measurements by ATLAS, CMS, LHCb and LHCf, including first results from the SMOG gas-jet target measurements of LHCb (see “Fixed-target physics in collider mode at LHCb“).

A public lecture, “Exploring the Invisible Universe”, by Nobel laureate Takaaki Kajita attracted more than 250 participants, and was complemented by a tour of the nuclear-emulsion laboratory of Nagoya University to see state-of-the-art emulsion technology. The progress in this technology was clearly visible when Edison Shibuya and others recalled the early days of studying cosmic rays with emulsion chambers and Saavedra’s pioneering contributions.

There were many discussions on future studies of relevance to cosmic-ray interactions and astroparticle physics. Hans Dembinski discussed prospects in the near and far future in collider experiments, including possible proton–oxygen runs at the LHC and a study of multi-particle production at a future circular collider. The cosmic-ray community is very enthusiastic about a future proton–oxygen run since, even with a short run of 100 million events, charged particle and pion spectra could be measured to an accuracy of 10% – a five-fold improvement over current model uncertainties that would bring us a crucial step closer to unveiling the cosmic accelerators of the highest energy particles in the universe.

The next ISVHECRI will be held in June 2020 at Ooty, India, the location of the GRAPES-3 air-shower experiment.

AWAKE accelerates electrons in world first

AWAKE spectrometer signal

The AWAKE experiment at CERN has passed an important milestone towards compact, high-energy accelerators for use in future high-energy physics experiments. Reporting in Nature on 29 August, the 18-institute international collaboration has for the first time demonstrated the acceleration of electrons in a plasma wakefield generated by a proton beam. The AWAKE team injected electrons into the plasma at an energy of around 19 MeV and, after travelling a distance of 10 m, the electrons emerged with an energy of about 2 GeV – an average acceleration gradient of around 200 MV/m. For comparison, the radio-frequency (RF) cavities of high-energy linear accelerators used for X-ray free-electron lasers achieve typical gradients of a few tens of MV/m.
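As a sanity check, the quoted gradient of around 200 MV/m follows directly from the beam energies and plasma length given above. A back-of-the-envelope sketch, using only the figures in the article:

```python
# Average accelerating gradient implied by the AWAKE figures quoted above:
# electrons enter at ~19 MeV and exit at ~2 GeV after 10 m of plasma.
e_in_mev = 19.0        # injection energy, MeV
e_out_mev = 2000.0     # exit energy (~2 GeV), in MeV
plasma_length_m = 10.0

# Energy gained divided by distance travelled, in MV/m
gradient_mv_per_m = (e_out_mev - e_in_mev) / plasma_length_m
print(f"average gradient ~ {gradient_mv_per_m:.0f} MV/m")  # ~ 198 MV/m
```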

Plasma-wakefield acceleration still has far to go before it can rival the performance of conventional RF technology, however. First proposed in the late 1970s, the technique accelerates charged particles by forcing them to “surf” atop a longitudinal plasma wave that contains regions of positive and negative charges. Two beams are required: a “witness” beam, which is to be accelerated, and a “drive” beam that generates the wakefield. Initial experiments took place with laser and electron drive beams at SLAC and elsewhere in the 1990s, and the advent of high-power lasers as wakefield drivers led to increased activity. Such techniques are now capable of bringing electrons to energies of a few GeV over a distance of a few centimetres.

AWAKE (the Advanced Wakefield Experiment) is a proof-of-principle R&D project that is the first to use protons for the drive beam. Since protons penetrate deeper into the plasma than electrons and lasers, thereby accelerating witness beams for a greater distance, they potentially can accelerate electrons to much higher energies in a single plasma stage. The experiment is driven by a bunch of 400 GeV protons from the Super Proton Synchrotron, which is injected into a plasma cell containing rubidium gas at a temperature of around 200ºC. An accompanying laser pulse is used to ionise the rubidium gas and transform it into a plasma. As the proton bunch travels through the plasma, it splits into a series of smaller bunches via a process called self-modulation, generating a strong wakefield as they move. A bunch of witness electrons is then injected at an angle into this oscillating plasma at relatively low energies and rides the plasma wave to get accelerated. At the other end of the plasma, a dipole magnet bends the incoming electrons onto a scintillator to allow the energy of the outgoing particles to be measured (see figure).

AWAKE has made rapid progress since its inception in 2013. Following the installation of the plasma cell in early 2016, in the tunnel formerly used by part of the CNGS facility at CERN, a proton-driven wakefield in a plasma was observed for the first time by the end of that year (CERN Courier January/February 2017 p8). The electron source, electron beam line and electron spectrometer were installed during 2017, completing the preparatory phase at the beginning of 2018, and the first electron acceleration was recorded early in the morning of 26 May.

So far, the AWAKE demonstration involves low-intensity electron bunches; the next steps include plans to create a high-energy electron beam of sufficient quality to be useful for applications, although tests will pause at the end of the year when the CERN accelerator complex shuts down for two years of upgrades and maintenance. A first application of AWAKE would be to deliver accelerated electrons to an experiment, extending the project with a fully fledged physics programme of its own. For eventual collider applications, another hurdle is to accelerate positrons. In the longer term, a global effort is under way to develop wakefield-acceleration techniques for a multi-TeV linear collider (CERN Courier December 2017 p31).

Although still at an early stage of development, the use of plasma wakefields could drastically reduce the size and therefore cost of accelerators. Edda Gschwendtner, technical coordinator and CERN project leader for AWAKE, says that the ultimate aim is to attain an average acceleration gradient of around 1 GV/m so that electrons can be accelerated to the TeV scale in a single stage. “We are looking forward to obtaining more results from our experiment to demonstrate the scope of plasma wakefields as the basis for future particle accelerators.”
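In the simplest single-stage picture, the hoped-for 1 GV/m gradient implies a plasma stage of order a kilometre to reach the TeV scale. This is a rough illustration inferred from the quoted figures, not a design parameter from the project:

```python
# Length of a single plasma stage needed to reach 1 TeV at 1 GV/m.
target_energy_gv = 1000.0   # 1 TeV expressed in GV (energy gain per unit charge)
gradient_gv_per_m = 1.0     # aspirational average gradient, GV/m

stage_length_m = target_energy_gv / gradient_gv_per_m
print(f"single-stage length ~ {stage_length_m:.0f} m")  # ~ 1000 m
```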

ALPHA takes antihydrogen to the next level

Antihydrogen 1S–2P spectral line shape

The ALPHA experiment at CERN’s Antiproton Decelerator (AD) has made yet another seminal measurement of the properties of antiatoms. Following its determination last year of both the ground-state hyperfine and the 1S–2S transitions in antihydrogen, the latter representing the most precise measurement of antimatter ever made (CERN Courier May 2018 p7), the collaboration has reported in Nature the first measurement of the next fundamental energy level: the Lyman-alpha transition. The result demonstrates that ALPHA is quickly and steadily paving the way for precision experiments that could uncover as yet unseen differences between the behaviour of matter and antimatter (CERN Courier March 2018 p30).

The Lyman-alpha (or 1S–2P) transition is one of several in the Lyman series that were discovered in atomic hydrogen just over a century ago. It corresponds to a wavelength of 121.6 nm and is a special transition in astronomy because it allows researchers to probe the state of the intergalactic medium. Finding any slight difference between such transitions in antimatter and matter would shake one of the foundations of quantum field theory, charge–parity–time (CPT) symmetry, and perhaps cast light on the observed cosmic imbalance of matter and antimatter.

The ALPHA team makes antihydrogen atoms by taking antiprotons from the AD and binding them with positrons from a sodium-22 source, confining the resulting antihydrogen atoms in a magnetic trap. A laser is used to measure the antiatoms’ spectral response, requiring a range of laser frequencies and the ability to count the number of atoms that drop out of the trap as a result of interactions between the laser and the trapped atoms. Having successfully employed this technique to measure the 1S–2S transition, ALPHA has now measured the Lyman-alpha transition frequency with a precision of a few parts in a hundred million: 2,466,051.7 ± 0.12 GHz. The result agrees with the prediction for the equivalent transition in hydrogen to a precision of 5 × 10–8.
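The quoted frequency, uncertainty and wavelength are mutually consistent, as a quick cross-check using only the numbers in the text shows:

```python
# Cross-check of the ALPHA 1S-2P numbers quoted above.
c = 299_792_458.0       # speed of light, m/s
f_ghz = 2_466_051.7     # measured transition frequency, GHz
sigma_ghz = 0.12        # quoted uncertainty, GHz

wavelength_nm = c / (f_ghz * 1e9) * 1e9   # lambda = c / f, converted to nm
rel_precision = sigma_ghz / f_ghz          # fractional uncertainty

print(f"wavelength ~ {wavelength_nm:.1f} nm")       # ~ 121.6 nm
print(f"relative precision ~ {rel_precision:.1e}")  # ~ 4.9e-08, a few parts in 1e8
```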

Although the precision is not as high as that achieved in hydrogen, the finding represents a pivotal technological step towards laser cooling of antihydrogen and the extension of antimatter spectroscopy to quantum states possessing orbital angular momentum. Simulations indicate that cooling to about 20 mK is possible with the current ALPHA set-up, which, combined with other planned improvements, would reduce the 1S–2S transition line width (see figure) by more than an order of magnitude. At such levels of precision, says the team, antihydrogen spectroscopy will have an impact on the determination of fundamental constants, in addition to providing elegant tests of CPT symmetry. Laser cooling will also allow precision tests of the weak equivalence principle via antihydrogen free-fall or antiatom-interferometry experiments.

“The Lyman-alpha transition is notoriously difficult to probe – even in normal hydrogen”, says ALPHA spokesperson Jeffrey Hangst. “But by exploiting our ability to trap and hold large numbers of antihydrogen atoms for several hours, and using a pulsed source of Lyman-alpha laser light, we were able to observe this transition. Next up is laser cooling, which will be a game-changer for precision spectroscopy and gravitational measurements.”

Europe calls for advanced detector and imaging ideas

The European Union (EU) has committed €17 million to help bring a total of 170 breakthrough detection and imaging ideas to market. Led by CERN and funded by the EU’s Horizon 2020 programme, the ATTRACT initiative involves several other European research infrastructures and institutes: the European Molecular Biology Laboratory, European Southern Observatory, European Synchrotron Radiation Facility, European XFEL, Institut Laue-Langevin, Aalto University, the European Industrial Research Management Association (EIRMA) and ESADE. It will focus on the development of new radiation sensor and imaging technologies both for scientific purposes and to address broader challenges in the domains of health, sustainable materials and information, and communication technologies.

Markus Nordberg of the CERN-IPT development and innovation unit laid the foundations for ATTRACT back in 2013, observing then how detector developers found it difficult to find suitable programmes to facilitate the wider use of generic detector R&D. “The detector R&D community, for example regarding the LHC upgrades and beyond, has ideas of the potential suitability of its technologies in other fields, but limited contacts, mechanisms or resources available to follow these ideas further or to make a case,” he says. “ATTRACT builds upon the collaborative spirit of open science and co-innovation, where the experience and available infrastructure at laboratories such as CERN could turn out to be useful.”

The ATTRACT seed fund (www.attract-eu.com) is open to researchers and entrepreneurs from organisations all over Europe. The call for proposals for CERN users and other outside laboratories working on detection and imaging technologies will close on 31 October, and the successful proposals will be announced in early 2019. The 170 projects funded by ATTRACT will have one year to develop their ideas, during which business and innovation experts from Aalto University, EIRMA and ESADE Business School will help project teams transform their technology into products, services, companies and jobs.

US initiative to tackle data demands of HL-LHC

Possible signal

The US National Science Foundation (NSF) has launched a $25 million effort to help tackle the torrent of data from the High-Luminosity Large Hadron Collider (HL-LHC). The Institute for Research and Innovation in Software for High-Energy Physics (IRIS-HEP), announced on 4 September, brings together multidisciplinary teams of researchers and educators from 17 universities in the US. It will receive $5 million per year for a period of five years, with a focus on developing new software tools, algorithms, system designs and training the next generation of users.

Construction for the HL-LHC upgrade is already under way (CERN Courier July/August 2018 p7) and the machine is expected to reach full capability in the mid-2020s. Boosting the LHC’s luminosity by a factor of almost 10, HL-LHC will collect around 25 times more data than the LHC has produced up to now and push data processing and storage to the limit. How to address the immense computing challenges ahead was the subject of a recent community white paper published by the HEP Software Foundation (CERN Courier April 2018 p38).

In 2016, the NSF convened a project to gauge the LHC data challenge, bringing together representatives from the high-energy physics and computer-science communities to review two decades of successful LHC data-processing approaches and discuss ways to address the obstacles that lay ahead. The new software institute emerged from that effort.

The institute is primarily about people, rather than computing hardware, explains IRIS-HEP principal investigator and executive director Peter Elmer of Princeton University, who is also a member of the CMS collaboration. “The institute will be virtual, with a core at Princeton, but coordinated as a single distributed collaborative project involving the participating universities similar to many activities in high-energy physics,” he says. “High-energy physics had a rush of discoveries in the 1960s and 1970s that led to the Standard Model of particle physics, and the Higgs boson was the last missing piece of that puzzle. We are now searching for the next layer of physics beyond the Standard Model. The software institute will be key to getting us there.”

Co-funded by NSF’s Office of Advanced Cyberinfrastructure (OAC) and the NSF division of physics, IRIS-HEP is the third OAC software institute, following the Molecular Sciences Software Institute and the Science Gateways Community Institute.

“Our US colleagues worked with us very closely preparing the community white paper last year, which was then used as one of the significant inputs into the NSF proposal,” says Graeme Stewart of CERN and the HEP Software Foundation. “So we’re really happy about the funding announcement and very much looking forward to working together with them.”

Hyper-Kamiokande construction to start in 2020

Hyper-K’s giant tank

On 12 September, the Japanese government granted seed funding towards the construction of the Hyper-Kamiokande experiment, a next-generation detector for the study of neutrinos. Japan’s Ministry of Education, Culture, Sports, Science and Technology (MEXT) allocated $700,000 within its budget request for the 2019 fiscal year, which will enable progress in preparatory work for construction and efforts to secure international collaboration.

Coinciding with the MEXT announcement, the University of Tokyo pledged to ensure that construction of the Hyper-Kamiokande detector commences in April 2020. According to a statement from university president Makoto Gonokami: “The University of Tokyo has made this decision in recognition of the project’s importance and value both nationally and internationally. … Seed funding in past projects has usually led to full funding in the following year, as was the case for the Super-Kamiokande project.”

Hyper-Kamiokande (Hyper-K) is a water Cherenkov detector centred on a huge underground tank containing 300,000 tonnes of water, with a sensitive volume about a factor of 10 larger than that of its predecessor Super-Kamiokande (Super-K). Like Super-K, Hyper-K will be located in Kamioka on the west coast of Japan directly in the path of a neutrino beam generated 295 km away at the J-PARC facility in Tokai, allowing it to make high-statistics measurements of neutrino oscillations. Together with a near-detector located close to J-PARC, Super-K formed the “T2K” long-baseline neutrino programme. An order of magnitude bigger than Super-K, Hyper-K will serve as the next far-detector at T2K, with a rich physics portfolio. This ranges from the study of CP violation in the leptonic sector and measurements of neutrino-mixing parameters, to studies of proton decay, atmospheric neutrinos and neutrinos from astrophysical sources.

It was at Super-K in 1998 that researchers discovered neutrino oscillations, proving that neutrinos are massive and leading to the award of the 2015 Nobel Prize in Physics to Takaaki Kajita of the University of Tokyo and Arthur McDonald of Queen’s University in Canada. The Japanese neutrino programme has progressed steadily since the 1998 discovery (CERN Courier July/August 2016 p29). Hyper-K was discussed as long ago as 2002 and a letter of intent was published in 2011, following the first measurement of the neutrino mixing angle θ13 at T2K, which boosted the expectation of a discovery of leptonic CP violation by Hyper-K. The experiment was proposed for Japan’s list of priority projects in 2014 but was not short-listed. It was proposed again in 2017, this time making the short-list of seven projects to be funded by MEXT. The Hyper-K conceptual design report was published earlier this year (see further reading).

“Hyper-Kamiokande now moves from planning to construction,” said Hyper-K project co-leader Francesca Di Lodovico of Queen Mary University of London, in a statement released by the Kavli Institute for the Physics and Mathematics of the Universe in Japan on behalf of the Hyper-K collaboration. “The collaboration will now work on finalising designs, and is very open to more international partners joining this exciting, far-reaching new experiment.” The Hyper-K proto-collaboration was formed in 2015 and currently comprises around 300 members from 73 institutes in 15 countries. Many European institutes are involved, including the CERN neutrino group, which is already participating in the upgrade of the T2K near detector to serve Hyper-K. To this end, in the summer of last year a detector called Baby MIND that was designed and built at CERN was shipped to J-PARC (CERN Courier July/August 2017 p12).

“Hyper-K is the next step in the Japanese neutrino adventure,” says Baby MIND spokesperson and Hyper-K collaborator Alain Blondel of the University of Geneva. “This success comes from wise choices and intelligent planning. The increase in the far-detector mass is exciting: demonstration of an asymmetry between neutrinos and antineutrinos was identified as the ‘great discovery’ goal as soon as neutrino oscillations were discovered, although it presents a challenge regarding systematics. And if a proton decay is detected or a supernova strikes, it will be fireworks!”

Survey addresses recognition in large collaborations

Building 40

The European Committee for Future Accelerators (ECFA) has created a working group to examine the recognition of individual achievements in large scientific collaborations. Based on feedback from an initial survey of the leaders of 29 CERN-based or CERN-recognised experiments in particle, nuclear, astroparticle and astrophysics, ECFA found that the community is ready to engage in dialogue on this topic and receptive to potential recommendations.

In response, ECFA has launched a community-wide survey to gauge how individual researchers perceive the systems put in place to recognise their achievements. The survey will be distributed widely, and can be found on the ECFA website (https://ecfa.web.cern.ch); the deadline for responses is 26 October.

The results of the survey will be disseminated and discussed at the upcoming plenary ECFA meeting at CERN on 15–16 November. An open session during the morning of 15 November, also to be webcast, will be devoted to the discussion of the outcomes of the survey, and aims to gather input to be submitted to the update of the European Strategy for Particle Physics (CERN Courier April 2018 p7). During the remaining open sessions, comprehensive overviews of all major future collider projects in and beyond Europe, and related accelerator technologies, will be given.

“Visibility and promotion of young scientists is of utmost importance in science and in particular also for the large collaborations in high-energy physics,” says ECFA chairperson Jorgen D’Hondt. “On the eve of the update process of the European Strategy, it is an outstanding opportunity for ECFA to take on its responsibility for informing the community about the opportunities and challenges ahead of us. Everybody is welcome.”

Thin silicon sharpens STAR imaging

Gold–gold collision

A new technology has enabled the STAR collaboration at Brookhaven National Laboratory’s Relativistic Heavy-Ion Collider (RHIC) to greatly expand its ability to reconstruct short-lived charm hadron decays, even in collisions containing thousands of tracks. A group of STAR collaborators, led by Lawrence Berkeley National Laboratory, used 400 Monolithic Active Pixel Sensor (MAPS) chips in its new vertex detector, called the heavy-flavour tracker (HFT), representing the first application of this technology in a collider experiment.

The HFT reconstructs charmed hadrons over a broad momentum range by identifying their secondary decay vertices, which are a few tens to hundreds of micrometres away from the collision vertex. The charmed hadrons are used to study heavy-quark energy loss in a quark–gluon plasma (QGP) and to determine emergent QGP-medium transport parameters.

The MAPS sensor is based on the same commercial CMOS technology that is widely used in digital cameras. It comprises an array of 928 × 960 square pixels, each measuring 20.7 × 20.7 μm2, providing a single-hit resolution of better than 6 μm. The sensors are thinned to a thickness of 50 μm and mounted on a carbon-fibre mechanical support, and their relatively low power consumption (170 mW/cm2) allows the detector to be air-cooled. The thinness is important to minimise multiple scattering in the HFT, allowing for good pointing resolution even for low transverse-momentum charged tracks.
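For context, a pixel detector with purely binary (hit/no-hit) readout has an r.m.s. position resolution of about pitch/√12, the standard estimate for a uniform distribution; the quoted figure of better than 6 μm is consistent with this, and charge sharing between neighbouring pixels can improve on it. A minimal sketch of the estimate (not the collaboration’s own calculation):

```python
import math

# Standard binary-readout resolution estimate for a square pixel:
# a uniform distribution of width p has an r.m.s. of p / sqrt(12).
pitch_um = 20.7
binary_resolution_um = pitch_um / math.sqrt(12)
print(f"binary resolution ~ {binary_resolution_um:.2f} um")  # ~ 5.98 um
```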

The heavy-flavour physics programme enabled by the HFT has been one of the driving forces for RHIC runs from 2014 to 2016. The first measurement with the HFT, of D0 elliptic flow, shows that D0 mesons exhibit significant hydrodynamic flow in gold–gold collisions, and the HFT’s pointing resolution also enabled the first measurement of charmed-baryon production in heavy-ion collisions.

Building on the success of the STAR HFT, the ALICE collaboration at CERN’s Large Hadron Collider is now building its own MAPS-based vertex detector – the ITS upgrade – and the sPHENIX collaboration at RHIC is also planning a MAPS-based detector. These next-generation detectors will have much faster event readout, by a factor of 20, to reduce event pileup and therefore allow physicists to reconstruct bottom hadrons more efficiently in high-luminosity, heavy-ion collision environments.
