Burton Richter, a major figure in particle physics who shared the Nobel Prize for the co-discovery of the J/ψ meson, passed away on 18 July in Palo Alto, California, at the age of 87.
Born in Brooklyn, New York, in 1931, Richter traced his love of science to the nightly blackouts during World War II, which revealed an unparalleled view of the night sky.
He studied physics at the Massachusetts Institute of Technology (MIT), where he was introduced to the electron–positron system by Martin Deutsch, who was conducting his classic positronium experiments. He wrote his bachelor’s thesis on the quadratic Zeeman effect in hydrogen and completed his PhD in 1956 on the photoproduction of pi-mesons from hydrogen.
That year, Richter moved to Stanford University’s high-energy physics laboratory as a research associate. In 1960, he became an assistant professor of physics, then associate professor in 1963 and professor in 1967. During this time, Richter married his wife, Laurose, and had two children, Elizabeth and Matthew. By 1970, Richter’s talents in experimental particle physics and accelerator physics led to the Stanford Positron-Electron Asymmetric Ring (SPEAR) at the Stanford Linear Accelerator Center (SLAC). It included a groundbreaking type of general-purpose detector that has been used in particle colliders ever since, and it would eventually produce his biggest discovery.
After Richter secured funding for SPEAR in 1970, it took him just 27 months to build the accelerator, at a cost of $6 million. Experiments commenced in 1973 and, famously, in November 1974, SPEAR flushed out what the SLAC team dubbed the “psi” meson – a bound state of a charm quark and its antiquark. Simultaneously, at Brookhaven National Laboratory on the other side of the continent, Sam Ting and his group had spotted the same resonance, which they christened the “J”. Just two years later, Richter and Ting shared the 1976 Nobel Prize in Physics for their pioneering discovery of the J/ψ, which proved the existence of a fourth type of quark (charm). It was a major step towards the establishment of the Standard Model of particle physics.
In 1975, before he received the Nobel Prize, Richter began a sabbatical year at CERN, during which he pursued an experiment at the Intersecting Storage Rings (ISR) – the world’s first hadron collider. He was hosted by Pierre Darriulat and worked on adding a muon spectrometer arm to the R702 experiment. Richter also worked out the general energy-scaling laws for high-energy electron–positron colliding-beam storage rings, looking specifically at the parameters of a collider with a centre-of-mass energy in the range 100–200 GeV, arguing that such a machine would be required to better understand the relationship between the weak and electromagnetic interactions: “That study turned into the first-order design of the 27 km-circumference LEP project at CERN that was so brilliantly brought into being by the CERN staff in the 1980s,” he wrote in his Nobel biography.
His influential paper “Very High Energy Electron–Positron Colliding Beams for the Study of the Weak Interactions” (Nucl. Instrum. Methods 136 47) was followed by two detailed studies: one concerning the physics, published in November 1976 as CERN Yellow Report 76-18, of which Burt was a co-author, and an accelerator study headed by Kjell Johnsen. “Burt’s paper and his personal advocacy of high-energy electron–positron collision triggered interest at CERN, and had a powerful impact on the development of the Laboratory, also paving the way for the LHC and the discovery of the Higgs boson,” says CERN’s John Ellis.
In 1978, along with others at SLAC, Richter began to investigate the possibility of turning the 3.2 km linear accelerator at SLAC into a linear electron–positron collider. Construction of the SLAC Linear Collider (SLC) began in 1983, and Richter became director of SLAC the following year, serving until he stepped down in 1999. During that time, he oversaw the construction of the SLC, which remains the only linear electron–positron collider ever built, and led the way to other machines for photon science. While SLAC director, Richter also initiated interregional collaborations with DESY in Germany and KEK in Japan, and was a proponent of bringing into existence a high-energy linear collider as a global collaboration.
“Perhaps his greatest contribution as director was, in the 1990s, designing a future for SLAC that would look very different from the past,” said Stanford Provost Persis Drell, who served as SLAC director from 2007 to 2012. “He recognised that pursuing an X-ray free-electron laser at SLAC could be used to provide a revolutionary science opportunity to the photon science community, who use X-rays as their tool for discovery. This vision became the Linac Coherent Light Source. Burt recognised that outstanding science needed to drive the future of the institution, and he did not flinch from designing that future.”
When he stepped down as SLAC director, Richter focused on public policy issues in science and energy, for which he received the prestigious 2007 Philip Hauge Abelson Prize from the American Association for the Advancement of Science. In 2010, he published Beyond Smoke and Mirrors: Climate Change and Energy in the 21st Century, an apolitical layperson’s exploration of the facts of climate and energy. Among his many accolades, Richter received the US National Medal of Science, the nation’s highest scientific honour, in 2014; the Enrico Fermi Award in 2012; and the Ernest Orlando Lawrence Award in 1976.
“In my career I have met no one who has made more fundamental contributions in electron–positron and electron–electron colliders, in the precision instrumentation used in colliders and in experimental physics,” says Ting. “After we received the Nobel Prize together in 1976, I met him many times and we became good friends. My wife, Susan, and I are going to miss him deeply.”
This year’s Nobel Prize in Physics was shared between three researchers for groundbreaking inventions in laser physics. Half the prize went to Arthur Ashkin of Bell Laboratories in the US for his work on optical tweezers, while the other half was awarded jointly to Gérard Mourou of the École Polytechnique in Palaiseau, France, and Donna Strickland of the University of Waterloo in Canada “for their method of generating high-intensity, ultra-short optical pulses”.
Mourou and Strickland’s technique, called chirped-pulse amplification (CPA), opens new perspectives in particle physics. Proposed in 1985, and forming the foundation of Strickland’s doctoral thesis, CPA uses a strongly dispersive medium to temporally stretch (“chirp”) laser pulses to reduce their peak power, then amplifies and, finally, compresses them – boosting the intensity of the output pulse dramatically without damaging the optical medium. The technique underpins today’s high-power lasers and is used worldwide for applications such as eye surgery and micro-machining.
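The arithmetic behind CPA can be sketched in a few lines of Python. The pulse energies, durations and gain below are hypothetical round numbers, chosen only to illustrate the orders of magnitude involved, not the parameters of any real laser system.

```python
# Illustrative back-of-the-envelope model of chirped-pulse amplification (CPA).
# All values are hypothetical, chosen only to show the orders of magnitude.

def peak_power(energy_j, duration_s):
    """Approximate peak power as pulse energy divided by pulse duration."""
    return energy_j / duration_s

seed_energy = 1e-9          # 1 nJ seed pulse
short_duration = 100e-15    # 100 fs
stretch_factor = 1e4        # stretch to ~1 ns before amplification
gain = 1e9                  # amplifier energy gain, bringing the pulse to ~1 J

# 1. Stretch: same energy, longer pulse, so the peak power drops by the
#    stretch factor, keeping intensity below the damage threshold.
stretched_duration = short_duration * stretch_factor
p_stretched = peak_power(seed_energy, stretched_duration)

# 2. Amplify: the energy grows while the pulse stays long.
amplified_energy = seed_energy * gain

# 3. Compress: recover the short duration at full energy.
p_final = peak_power(amplified_energy, short_duration)

print(f"peak power during amplification: {peak_power(amplified_energy, stretched_duration):.1e} W")
print(f"final compressed peak power:     {p_final:.1e} W")
```

With these toy numbers the compressed pulse reaches a peak power roughly thirteen orders of magnitude above the seed, while the power inside the amplifier itself stays four orders of magnitude lower — which is precisely the trick that protects the optical medium.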
Surfing the waves
But CPA’s potential for particle physics was clear from the beginning. In particular, high-power ultra-short laser pulses can drive advanced plasma-wakefield accelerators in which charged particles are brought to high energies over very short distances by surfing longitudinal plasma waves.
“After we invented laser-wakefield acceleration back in 1979, I was acutely aware that the laser community at that time did not have the specification that we needed to drive wakefields, which needed ultrafast and ultra-intense pulses,” explains Toshi Tajima of the University of California at Irvine, a long-time collaborator of Mourou. Tajima became aware of CPA in 1989 and first met Mourou in 1993 at a workshop at the University of Texas at Austin devoted to the future of accelerator physics upon the demise of the Superconducting Super Collider. “Ever since then, Gérard and I have formed a strong scientific and personal bond to promote ultra-intense lasers and their applications to accelerators and other important societal applications such as medical accelerators, transmutation and intense X-rays,” he says.
Today, acceleration gradients two to three orders of magnitude higher than existing radio-frequency (RF) techniques are possible at state-of-the-art laser-driven plasma-wakefield experiments, promising more compact and potentially cheaper particle accelerators. Though not yet able to match the quality and reliability of conventional acceleration techniques, plasma accelerators might one day be able to overcome the limitations of today’s RF technology, thinks Constantin Haefner, program director for advanced photon technologies at Lawrence Livermore National Laboratory in the US. “The race has started,” he says. “The ability to amplify lasers to extreme powers enabled the discovery of new physics, and even more exciting, some of the early envisioned applications such as laser plasma accelerators are on the verge of moving from proof-of-principle to real machines.”
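To see what two to three orders of magnitude means in practice, here is a toy comparison. The gradients used are illustrative round numbers (tens of MV/m for RF cavities, tens of GV/m demonstrated in plasma experiments), not the parameters of any specific machine.

```python
# Rough comparison of accelerator lengths at a constant gradient.
# Gradients are illustrative: ~30 MV/m for a conventional RF linac versus
# ~30 GV/m for a laser-driven plasma-wakefield stage.

def length_for_energy(energy_ev, gradient_v_per_m):
    """Distance needed to reach a given energy gain at a constant gradient."""
    return energy_ev / gradient_v_per_m

target = 10e9            # 10 GeV energy gain, as an example
rf_gradient = 30e6       # ~30 MV/m
plasma_gradient = 30e9   # ~30 GV/m, three orders of magnitude higher

print(f"RF linac:     {length_for_energy(target, rf_gradient):.0f} m")
print(f"Plasma stage: {length_for_energy(target, plasma_gradient):.2f} m")
```

Under these assumptions the same 10 GeV gain needs hundreds of metres of RF structure but only a fraction of a metre of plasma — the origin of the "more compact and potentially cheaper" promise, leaving aside the very real challenges of beam quality and staging.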
Electrons can also be used to drive plasma accelerators, as is being explored at SLAC and in European labs such as LNF in Italy and DESY in Germany. Meanwhile, the AWAKE experiment at CERN has recently demonstrated the first proton-driven plasma-wakefield acceleration (CERN Courier October 2018 p7). Although AWAKE does not use a laser to drive the plasma, it employs a high-power laser to generate the plasma from a gas, at the same time seeding the proton self-modulation process that allows charged particles to be accelerated. CERN is also a partner in a recent project called the International Coherent Amplification Network, led by Mourou and funded by the European Union, to explore advanced wakefield drivers based on the coherent combination of multiple high-intensity fibre lasers that can run at high repetition rates and efficiencies.
“We have a long way to go, but plasma accelerators have game-changing potential for high-energy physics,” says Wim Leemans, director of the accelerator technology and applied physics division and Berkeley Lab Laser Accelerator Center (BELLA) at Lawrence Berkeley National Laboratory. “Other applications already being explored include free-electron lasers, a quasi-monoenergetic gamma-ray source for nonproliferation and nuclear security purposes, and a miniaturised method for brachytherapy, a cancer-treatment modality in which radiation is delivered directly to the site of a tumour.”
Beyond accelerators, the enormous intensity of single-shot pulses enabled by CPA offers new types of experiments in high-energy physics. In 2005, Mourou initiated the Extreme Light Infrastructure (ELI), nearing completion in the Czech Republic, Hungary and Romania, to explore the use of high-power PW lasers such as Livermore Lab’s HAPLS facility (see image on previous page). Going beyond ELI is the International Center for Zetta- and Exawatt Science and Technology (IZEST), established in France in 2011 to develop and build a community around the emerging field of laser-based particle physics. Under Mourou and Tajima’s direction, IZEST will extend existing laser facilities (such as PETAL at the Megajoule Laser facility in France) to the exa- and zettawatt scale, opening studies including “searches for dark matter and energy and probes of the nonlinearity of the vacuum via zeptosecond dynamical spectroscopy.”
CHARM, a unique facility at CERN to test electronics in complex radiation environments, has been used to test its first full space system: a micro-satellite called CELESTA, developed by CERN in collaboration with the University of Montpellier and the European Space Agency. Built to monitor radiation levels in low-Earth orbit, CELESTA was successfully tested and qualified during July under a range of radiation conditions that it can be expected to encounter in space. It serves as an important validation of CHARM’s potential value for aerospace applications.
CELESTA’s main goal is to enable a space version of an existing CERN technology called RadMon, which was developed to monitor radiation levels in the Large Hadron Collider (LHC). RadMon also has potential applications in space missions that are sensitive to the radiation environment, ranging from telecom satellites to navigation and Earth-observation systems.
The CELESTA cubesat, a technological demonstrator and educational project made possible with funding from the CERN Knowledge Transfer fund, will play a key role in validating potential space applications by using RadMon sensors to measure radiation levels in low-Earth orbit. An additional goal of CELESTA is to demonstrate that the CHARM facility is capable of reproducing the low-Earth orbit radiation environment. “CHARM benefits from CERN’s unique accelerator facilities and was originally created to answer a specific need for radiation testing of CERN’s electronic equipment,” explains Markus Brugger, deputy head of the engineering department and initiator of both the CHARM and CELESTA projects in the framework of the R2E (Radiation to Electronics) initiative. The radiation field at CHARM is generated through the interaction of a 24 GeV/c proton beam extracted from the Proton Synchrotron with a cylindrical copper or aluminium target. Different shielding configurations and testing positions allow for controlled tests to account for desired particle types, energies and fluences.
It is the use of mixed fields that makes CHARM unique compared to other test facilities, which typically use mono-energetic particle beams or sources. With such beams, only one or a few discrete energies can be tested, which is usually not representative of the authentic and complex radiation environments encountered in aerospace missions. Most testing facilities also use focused beams, limiting tests to individual components, whereas CHARM has a homogeneous field extending over an area of at least one square metre, which allows complete and complex satellites and other systems to be tested.
CELESTA is now fully calibrated and will be launched as soon as a launch window is provided. Once in orbit, in-flight data from CELESTA will be used to validate the CHARM test results against authentic space conditions. “This is a very important milestone for the CELESTA project, as well as an historical validation of the CHARM test facility for satellites,” says Enrico Chesta, CERN’s aerospace applications coordinator.
In late August, a beam of electrons successfully circulated for the first time through a new particle accelerator at Fermilab in the US. The Integrable Optics Test Accelerator (IOTA), a 40 m-circumference storage ring, is one of only a handful of facilities worldwide dedicated to beam-physics studies. It forms the centrepiece of the Fermilab Accelerator Science and Technology (FAST) facility, and is the first research accelerator that will be able to switch between beams of electrons and protons.
Researchers will use IOTA to explore multiple accelerator technologies, including several that have been proposed but never tested, in particular targeting ultrahigh-intensity beams. More fundamentally, it will allow precise control of a single electron, opening the door to unique experiments in fundamental physics, such as understanding how the electron’s quantum-mechanical nature blurs its position in space.
For accelerator physicists, IOTA’s key focus is to test the concept of a nonlinear integrable focusing lattice in a realistic storage ring. Whereas contemporary accelerators are designed with linear focusing lattices, in reality machines always have nonlinearities, e.g. resulting from magnet imperfections, which lead to resonant behaviour and particle losses. A nonlinear integrable focusing lattice, proposed in 2010, is predicted to significantly suppress collective instabilities via Landau damping and thus could improve the performance of accelerators such as a Future Circular Collider. IOTA scientists will also capitalise on Fermilab’s existing strengths in accelerator technologies, such as cooling, to make more orderly beams that are easier to manipulate and accelerate.
Over the next year, the Fermilab team will install the proton injector. Once it is in place, it will complete the trio of particle accelerators that make up Fermilab’s FAST facility: the proton injector, the electron injector (completed in 2017) and the IOTA ring. FAST has already attracted 29 institutional partners, including European institutions, US universities, national laboratories and members from industry.
“IOTA is one of a kind – a particle storage ring designed and built specifically to host novel experiments with both electrons and protons, and to develop innovative concepts in accelerator science,” says Fermilab physicist Alexander Valishev, head of the team that developed and constructed IOTA. “This facility offers a flexibility that can be useful to a wider community – above and beyond the needs of high-energy physics.”
The world’s largest liquid-argon neutrino detector has recorded its first particle tracks in tests at CERN, marking an important step towards the international Deep Underground Neutrino Experiment (DUNE) under preparation in the US. The enormous ProtoDUNE detector, designed and built at CERN’s neutrino platform, is the first of two prototypes for what will be a much larger DUNE detector. Situated deep beneath the Sanford Underground Research Facility in South Dakota, four final DUNE detector modules (each 20 times larger than the current prototypes and containing a total of 70,000 tonnes of liquid argon) will record neutrinos sent from Fermilab’s Long Baseline Neutrino Facility some 1300 km away.
DUNE’s scientific targets include CP violation in the neutrino sector, studies of astrophysical neutrino sources, and searches for proton decay. When neutrinos enter the detector and strike argon nuclei they produce charged particles, which leave ionisation traces in the liquid from which the event can be reconstructed in 3D. The first ProtoDUNE detector took two years to build and eight weeks to fill with 800 tonnes of liquid argon, which needs to be cooled to a temperature below –184 °C. It adopts a single-phase architecture, which is an evolution of the 170 tonne MicroBooNE detector at Fermilab’s short-baseline neutrino facility. The second ProtoDUNE module adopts a different, dual-phase, scheme with a second detection chamber.
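The reconstruction principle can be sketched in a few lines: the readout wire planes give coordinates in the plane of the detector, and the electron drift time gives the third. The drift velocity and wire pitch below are rough, generic values for a liquid-argon time-projection chamber, not ProtoDUNE’s actual parameters.

```python
# Minimal sketch of how a liquid-argon time-projection chamber turns
# (wire, drift-time) hits into spatial points. Values are illustrative.

DRIFT_VELOCITY_MM_PER_US = 1.6   # rough LAr drift speed at ~500 V/cm
WIRE_PITCH_MM = 5.0              # assumed spacing between readout wires

def hit_to_point(wire_index, drift_time_us):
    """Map a hit to (drift coordinate, wire-plane coordinate) in mm."""
    x = drift_time_us * DRIFT_VELOCITY_MM_PER_US  # distance the electrons drifted
    z = wire_index * WIRE_PITCH_MM                # position along the wire plane
    return (x, z)

# A straight track shows up as hits whose drift time grows linearly with
# wire number; combining several wire planes yields the full 3D picture.
track = [hit_to_point(w, 100.0 + 2.0 * w) for w in range(5)]
for x, z in track:
    print(f"x = {x:6.1f} mm, z = {z:5.1f} mm")
```

The real experiment adds induction planes at stereo angles, signal deconvolution and pattern recognition on top of this geometric core, but the mapping from drift time to position is the essential idea behind the "3D image" of each neutrino interaction.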
The construction and operation of ProtoDUNE will allow researchers to validate the membrane cryostat technology and associated cryogenics for the final detector, in addition to the networking and computing infrastructure. Now that the first tracks have been seen, from beam tests involving cosmic rays and charged-particle beams from CERN’s SPS, ProtoDUNE’s operation will be studied in greater depth. The charged-particle beam test enables critical calibration measurements necessary for precise calorimetry, and will also produce valuable data for optimising event-reconstruction algorithms. These and other measurements will help quantify and reduce systematic uncertainties for the DUNE far detector and significantly improve the physics reach of the experiment. “Seeing the first particle tracks is a major success for the entire DUNE collaboration,” said DUNE co-spokesperson Stefan Soldner-Rembold of the University of Manchester, UK.
More than 1000 scientists and engineers from 32 countries on five continents are working on the development, design and construction of the DUNE detectors. For CERN, it is the first time the European lab has invested in infrastructure and detector development for a particle-physics project in the US. “Only two years ago we completed the new building at CERN to house two large-scale prototype detectors that form the building blocks for DUNE,” said Marzio Nessi, head of the neutrino platform at CERN. “Now we have the first detector taking beautiful data, and the second detector, which uses a different approach to liquid-argon technology, will be online in a few months.”
In July, the US Department of Energy also formally approved PIP-II, an accelerator upgrade project at Fermilab required to deliver the high-power neutrino beam required for DUNE. First data at DUNE is expected in 2026. Meanwhile, in Japan, an experiment with similar scientific goals and also with scientific links to the CERN neutrino platform – Hyper-Kamiokande – has recently been granted seed funding for construction to begin in 2020 (CERN Courier October 2018 p11). Together with several other experiments such as KATRIN in Germany, physicists are closing in on the neutrino’s mysteries two decades after the discovery of neutrino oscillations (CERN Courier July/August 2018 p5).
The 2018 Global Physics Photowalk brought hundreds of amateur and professional photographers to 18 laboratories around the world, including CERN, to capture their scientific facilities and workforce. The science of the participating labs ranges from exploring the origins of the cosmos to understanding our planet’s climate, and from improving human and animal health to helping deliver secure and sustainable food and energy supplies for the future.
Following local competitions, each lab submitted its top three images to the global competition. A public online vote chose the top three from those images, and a jury of expert photographers and scientists also picked their three favourites. The photowalk was organised by the Interactions collaboration, and was supported by the Royal Photographic Society and Association of Science-Technology Centers (ASTC). The winning entries, shown here, were announced on 30 September at the ASTC annual conference in Hartford, Connecticut.
Simon Wright bagged first place in the expert jury’s choice with this shot taken at the UK’s STFC Boulby Underground Laboratory, which is located 1.1 km underground in Europe’s deepest operating mine and contributes to the search for dark matter. The photograph captures STFC’s Tamara Leitan as she scanned an information board at the lab. To highlight Leitan’s face, Wright used a miner’s lamp instead of a flash to minimise interference with light reflected from the safety equipment that workers must wear at the mine.
Simon Wright received another award, this time third prize in the people’s choice category, for this image of green fluorescent lighting in an underground tunnel at the UK’s STFC Chilbolton Observatory, which is home to a wide range of science facilities.
Jon McRae took third place in the expert jury’s selection, as well as second place in the people’s choice, for this photo of the DESCANT neutron detector at Canada’s TRIUMF laboratory. The detector can be mounted on the TIGRESS and GRIFFIN experiments to study nuclear structure. Holding a small, spherical lens between the camera and the detector array, McRae recreated a miniature simulacrum of DESCANT in the crystal-clear glass ball.
Stefano Ruzzini won the expert jury’s second prize for this photograph of a silicon-strip particle detector, which was first used in CERN’s NA50 experiment but is now at Italy’s INFN Frascati National Laboratories. The photo was praised by the judges for portraying the three-dimensional aspect of the detector.
This picture from Gianluca Micheletti was also awarded third place in the expert jury’s selection. It shows a researcher observing the XENON1T dark-matter experiment at Italy’s INFN Gran Sasso National Laboratories. The judges commended Micheletti’s composition of the image in evoking the sense of curiosity at the heart of physics.
Luca Riccioni snapped a picture of the KLOE-2 experiment at Italy’s INFN Frascati National Laboratories, which recently concluded its data-taking campaign at the DAΦNE electron–positron collider. The photograph was awarded first place in the people’s choice category.
Is it fun to learn physics from a textbook? According to many teenage participants in CERN’s Beamline for Schools (BL4S) programme, physics lessons at school are much too theoretical. Students from some countries do not even have physics lessons at all, let alone any contact with current science.
Back in 2011, experimental particle physicist Christoph Rembser of CERN had an idea to get high-school students engaged with particle physics by offering them the chance to carry out their own experiment on a CERN beamline. Three years later, CERN’s 60th anniversary in 2014 offered an opportunity for what was meant to be a one-off worldwide science competition: BL4S was born. With the help of the media attention on CERN around the time of the anniversary, teams of high-school students and their teachers were invited to propose an experiment at CERN. The response was overwhelming: almost 300 teams involving more than 3000 students from 50 countries submitted a proposal.
When the first two teams came to CERN in September 2014, it was clear that BL4S would not be a one-off event. Clearly the competition had the potential to attract large numbers of high-school students every year to get deeply involved with physics at the crucial stage in their education, two years before leaving school to take up further study. Ashish Tutakne, a member of the 2018 winning team from the Philippines, sums this up: “I believe the experience holds significant weight as it is not only a chance to collaborate with some of the smartest people in the world on a scientific project, it is also a taste of what conducting research is actually like. It is this experience that I believe that will in fact prove valuable to me … throughout the rest [of] my life.”
CERN and society
Thanks to the huge success of the first edition, institutes and foundations around the world also recognised the potential of the competition. Through the CERN & Society Foundation, an independent charitable organisation supported by private donors, BL4S has since been provided with the financial help without which it would not have been possible to turn the competition into an annual event. The CERN & Society Foundation has the aim of spreading CERN’s spirit of scientific curiosity for the benefit of society, and supports young talent through high-quality, hands-on training. This year, for example, in addition to the BL4S initiative, the foundation has helped more than 80 educators participate in CERN’s national teacher programme and granted more than 60 Summer Student scholarships.
So far, more than 900 teams with almost 8500 students from 76 countries have taken part in the BL4S competition, with one third of these students being female. While in the first edition in 2014 about 70% of the teams came from member states of CERN, this year roughly two-thirds of the participating teams were from associate and non-member states. This emphasises the international character of the competition and its global appeal.
The announcement of each edition of BL4S is usually made during the summer the year before, with a deadline for submitting a proposal of up to 1000 words and a one-minute video by 31 March. After about two months of evaluation, involving more than 50 volunteer physicists, the two winning teams and up to 30 shortlisted teams are announced in June. Besides certificates, which every participant receives, the shortlisted teams win special prizes such as BL4S T-shirts for every team member. The two winning teams are finally invited to CERN in September/October for a period of about 12 days to carry out their experiments.
Of course, the students do not do this alone: they are guided by two professional scientists. These support scientists, typically young physics PhD students, make the largest contribution to the success of BL4S. They are not only responsible for the fine-tuning and implementation of the winning teams’ experiments but have, in collaboration with the CERN detector workshops, also developed bespoke devices for use in the BL4S experiments. Even though the support scientists are involved with the project for less than a year, it offers them the opportunity to carry out a complete physics experiment from beginning to end; the skills they have acquired have helped several of them to find interesting postdoc positions.
Beamline specifics
From the beginning, BL4S attracted a lot of CERN staff members as well as users and even retired staff to make voluntary contributions to the organisation of the event. This involves answering questions from the student teams, evaluating proposals, developing detectors and software, helping the winners with the analysis of the data, and many other things. These volunteers have become a crucial part of the competition.
CERN’s accelerator complex is vast, and is in constant use by thousands of physicists worldwide. Since the first edition, the BL4S experiments have taken place at the T9 beamline of the Proton Synchrotron fixed-target area in the “East Hall” on the main CERN site. This beamline offers a secondary beam with a momentum of 0.5–10 GeV/c and a mixture of electrons, pions, kaons, protons and some muons. Regarding detectors, CERN provides a range of technologies: scintillators, Cherenkov counters, delay wire chambers, multigap resistive plate chambers, micro-mesh gaseous structure detectors, lead-glass calorimeters and Timepix detectors. In addition, students are allowed to build their own detectors and bring them to CERN. For the triggering, NIM modules are used, while the data-acquisition system is based on the RCD-TDAQ system of the ATLAS experiment. The student teams are provided with a detailed document that describes all of these components.
The students are completely free with respect to the experiment and the use of these materials, as long as it does not raise any safety concerns. Quite often we are surprised by their creativity, and the ten winning proposals from the past five years illustrate the wide spectrum of their ideas (see table above). Beyond these winning proposals, all of the proposals received show what captures the attention of curious teenagers. Just a few examples are: the shielding of spacecraft to protect astronauts from the dangers of cosmic radiation; the analysis of the atmosphere with respect to greenhouse gases; the exploration of natural resources; the creation of artificial aurora borealis; and the artistic translation of the signals of elementary particles into sights or sounds.
For successful participation in BL4S, the role of teachers and other mentors is paramount. Many teachers do not feel confident enough to propose that their students take part in BL4S, either because they feel underqualified for such a challenge or because they do not get the support from their schools that is necessary to coach a team for many weeks, if not months. After all, many teachers are severely limited in the time they can devote to such activities. In some cases, the students go ahead without any mentors and complete their proposal in a self-directed way. In others, they contact physicists at local universities or at one of the national or regional contact points established in almost 30 countries. Usually, however, the main burden is on the teacher, and we are very grateful to the many teachers who every year dedicate a substantial part of their free time to coaching a team of students. Unfortunately, our surveys show that, due to the high workload, only a few teachers are able to participate several years in a row.
The effect that BL4S has on the many students who are not lucky enough to be invited to CERN is difficult to assess. We know, however, via feedback from several teachers, that BL4S is appreciated as a means of motivating their students. In addition, the students themselves often write that participation was a great experience, and many are even motivated to improve their proposals and take part again in the next edition.
The winning teams are encouraged to stay together after having been at CERN and to write a paper about their experiment. So far, three papers have been published in an international peer-reviewed journal, Physics Education, with the following titles: Building and testing a high school calorimeter at CERN; The secret chambers in the Chephren pyramid; and Testing the validity of the Lorentz factor (see further reading). Papers are typically published one to two years after the completion of the experiment. At least one further paper is currently in the pipeline. This is not a mandatory step for the teams, but it represents a unique opportunity to have authored a scientific publication before even starting at university.
According to a recent survey among the previous winners, most take up studies of natural sciences, engineering or mathematics. Max Raven of the 2016 winning team “Relatively Special” from Colchester Royal Grammar School in the UK remarked: “The most beneficial impact of BL4S has been the strong team-working and communication skills I developed… This invaluable experience has been instrumental to developing my interpersonal skills, which are vital for a successful career in engineering.” After taking part in the BL4S competition, Raven was accepted to study engineering at the Massachusetts Institute of Technology.
Students and teachers alike are clearly very happy to be associated with the competition, and this also benefits CERN and its educational aims. Winning BL4S often creates a lot of media attention in the home region of the teams or even at the national level, and recently the two Italian teams that won BL4S in 2015 and 2017 were invited to the ministry of foreign affairs in Rome for a special ceremony. At the same time, BL4S makes a contribution to physics education by leading students into a field of physics rarely touched upon in school curricula. Being able to do hands-on physics with detectors and accelerators that are also used for current experiments is a huge motivation for students to learn, even in their free time. Yash Karan, a member of the Philippine winning team in 2018, remarked: “I have learnt much more in the last two weeks at CERN than in the last six months in school!”
Next stop DESY
At the end of this year, CERN’s accelerator complex will be shut down for two years for maintenance and upgrades, in particular for the High-Luminosity LHC. This opens a new chapter in the history of the BL4S competition. In close collaboration with the DESY laboratory in Germany, the competition will continue there in 2019. DESY will provide beam time at the DESY II facility, offering electron and positron beams, and employ a dedicated support scientist on a three-year staff contract. Other institutes, such as INFN-Frascati in Italy and the Paul Scherrer Institut in Switzerland, are also interested in hosting the competition in the future.
What remains is the never-ending challenge of spreading the word. Even though CERN has many traditional and modern channels of communication, making BL4S known to high-school students and teachers around the world takes the effort of a large number of people at all levels. In particular, volunteers are needed to spread the word in their region and through their available channels, where they play several roles: acting as additional regional contacts for candidate teams; providing coaching if no teacher is available; taking part in the evaluation of proposals; assisting the winning teams with their data analysis and writing of scientific papers; and, finally, finding additional sponsors. Anyone interested can contact the BL4S team via bl4s.team@cern.ch.
As this article went to press, the 2018 winners were completing their experiments, which were hugely successful. All students claimed to have gained an immense increase in knowledge and they admired the passion that surrounded them everywhere they went at CERN. Working together in mixed shift crews each day, the teams have also learned about one another’s experiments, fostering cooperation and personal growth. Quotes such as “Beamline for Schools was a life-changing experience” are not uncommon, and many of this year’s students have made up their minds that they would like to pursue a career in particle physics or engineering.
Registration and proposal submission for BL4S 2019 are now open. Hopefully the next edition will attract even more students from all around the globe to take part in this unique opportunity.
Although the quark model of hadrons is highly successful in describing how quarks combine to form baryons and mesons, the internal mechanisms governing the dynamics of the strong force that binds quarks inside those hadrons are far from fully understood. Studying new hadronic resonances and their excited states can shed light on these mechanisms.
LHCb physicists have recently observed, for the first time, two new baryons. These states, named Σb(6097)+ and Σb(6097)−, occur as resonances appearing in the two-body system Λb0π±, which consists of a neutral Λb0 baryon and a charged π meson (see figure). The statistical significances of the observations are 12.7σ and 12.6σ, well above the threshold for discovery.
The new particles are members of the Σb family of baryons. Four of the six so-called ground states of this family, the Σb+, Σb–, Σb*+, and Σb*–, were previously discovered by the CDF collaboration at the Tevatron. LHCb also reports a study of the properties of these four ground states, measuring them with unprecedented statistics and improving the precision on their masses and widths by a factor of approximately five.
Establishing precisely how the new Σb(6097)+ and Σb(6097)− states fit into this family is not straightforward. Theoretical predictions for a number of excited Σb states exist, including five Σb(1P) states with expected masses close to the values seen by LHCb – though some of them may be difficult to observe experimentally. Since it is possible for different excited states to have similar masses, it cannot be excluded that the newly observed mass peaks are actually superpositions of more than one state. Further input from theory, and future experimental studies with more data and in other final states, will help resolve this question.
The meson sector is also capable of providing surprising results. Evidence for another new hadron has recently been reported by LHCb in a Dalitz plot analysis of B0 decays to ηc(1S) K+ π−. A structure, which could be a new resonance in the ηc(1S) π− system, was detected with a significance of more than three standard deviations. While this does not meet the threshold for discovery, it is an intriguing hint and will be pursued with more data. If confirmed, this new Zc(4100)− resonance would be one of a small number of manifestly exotic mesons that cannot be described as a quark–anti-quark pair but must instead have a more complicated structure, such as being a tetraquark combination of two quarks and two antiquarks.
General relativity predicts very accurately how objects fall from a table and how planets move within the solar system. At larger scales, however, some issues arise. The most glaring is the theory’s prediction of the motion of stars within a galaxy and of the acceleration of galaxies away from each other, both of which are at odds with observations. Models containing dark matter and dark energy can solve these two problems, respectively. Another potential solution is that space–time contains additional dimensions, modifying general relativity. Such additional dimensions are not observable with electromagnetic waves, but new information gleaned from gravitational waves (GWs) is allowing such models to be tested for the first time.
Some modifications of general relativity, such as the Dvali–Gabadadze–Porrati (DGP) model, involve the addition of extra dimensions accessible to gravity. If such extra dimensions are large, and thus not rolled up to a microscopic size as predicted by some beyond-Standard Model theories, part of the gravitational field would “leak” into the extra dimensions. GWs arriving at detectors such as those of the LIGO and Virgo observatories would therefore be weaker than expected.
The first GWs detected, in September 2015, came from distant black-hole binaries. For such objects, there is no electromagnetic-wave counterpart, so the only information astronomers have about their distance from Earth is from the GWs themselves, making it impossible to check if some of the wave’s intensity was lost. However, GW170817, the first observed merger of binary neutron stars, produced both GWs and electromagnetic radiation, which was measured by a wide range of instruments (CERN Courier December 2017 p16). As a result, we know in which galaxy the merger took place and therefore have a good measurement of the distance the GWs travelled. Using this distance measurement and the measured strength of the GW signal, one can test whether the signal follows general relativity or a model with additional dimensions.
Doing exactly this, a group led by Kris Pardo from Princeton University has found that the results are most compatible with the standard 3+1 space–time-dimensions picture. Because of the large discrepancy between Hubble-constant values obtained by two different methods (CERN Courier May 2018 p17), the researchers carried out the analysis for both values and show that, regardless of the value assumed, the results allow for a total of 4.0 ± 0.1 dimensions (see figure).
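The logic of the distance comparison can be sketched in a few lines. This is a minimal illustration, not the collaboration’s actual analysis: the function name, the 1 Mpc normalisation and the example numbers are assumptions. It assumes the GW strain falls off as h ∝ d^−(D−2)/2 in D space–time dimensions, so a GW-inferred distance (obtained assuming standard four-dimensional propagation, h ∝ 1/d) that exceeds the electromagnetic distance would signal gravitational leakage.

```python
import math

def inferred_dimensions(d_em_mpc: float, d_gw_mpc: float) -> float:
    """Infer the total number of space-time dimensions D from GW damping.

    Assumes the strain scales as h ~ d**(-(D-2)/2), normalised at a
    hypothetical reference distance of 1 Mpc.  d_em_mpc is the distance
    from the electromagnetic counterpart; d_gw_mpc is the distance one
    would infer from the strain assuming standard 4D propagation (h ~ 1/d).
    """
    gamma = math.log(d_gw_mpc) / math.log(d_em_mpc)  # damping exponent (D-2)/2
    return 2.0 * gamma + 2.0

# No leakage: GW-inferred and electromagnetic distances agree, so D = 4
print(inferred_dimensions(40.0, 40.0))  # -> 4.0
```

If part of the signal leaked into extra dimensions, the strain would be weaker, the 4D-inferred distance would exceed the electromagnetic one, and the function would return a value above four.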
The authors also obtained a lower limit on the graviton’s lifetime of 450 million years. As with a potential leakage of gravity into extra dimensions, the decay of gravitons propagating towards Earth would cause the strength of the GW signal to decrease; the observed signal strength therefore constrains how short-lived the graviton can be.
These findings are just the beginning of the physics studies made possible by gravitational-wave astronomy. As the authors make clear in their paper, the results only affect theories with finite but large-scale extra dimensions. That may change, however, as more GWs are expected to be measured, with increased precision, in the future. One promising parameter capable of probing a larger set of models is the polarisation of the GWs. For the GW170817 system, polarisation information was not available at the time of observation owing to the limited number of GW detectors. Any higher-dimensional model allows for extra GW polarisation modes, which can be studied with the help of additional GW detectors such as the planned KAGRA and IndIGO facilities.
With a future global array of GW detectors, we can look forward to more studies in this field of physics which, until now, has been almost inaccessible.
What led you to the 1968 paper for which you are most famous?
In the mid-1960s we theorists were stuck in trying to understand the strong interaction. We had an example of a relativistic quantum theory that worked: QED, the theory of interacting electrons and photons, but it looked hopeless to copy that framework for the strong interactions. One reason was the strength of the strong coupling compared to the electromagnetic one. But even more disturbing was that there were so many different species of hadrons, with the number ever growing, that we felt at a loss with field theory – how could we cope with so many different states in a QED-like framework? We now know how to do it and the solution is called quantum chromodynamics (QCD).
But things weren’t so clear back then. The highly non-trivial jump from QED to QCD meant having the guts to write a theory for entities (quarks) that nobody had ever seen experimentally. No one was ready for such a logical jump, so we tried something else: an S-matrix approach. The S-matrix, which relates the initial and final states of a quantum-mechanical process, allows one to directly calculate the probabilities of scattering processes without solving a quantum field theory such as QED. This is why it looked more promising. It also looked very conventional but eventually led to something even more revolutionary than QCD – the idea that hadrons are actually strings.
Is it true that your “eureka” moment was when you came across the Euler beta function in a textbook?
Not at all! I was taking a bottom-up approach to understand the strong interaction. The basic idea was to impose on the S-matrix a property now known as Dolen–Horn–Schmid (DHS) duality. It relates two apparently distinct processes contributing to an elementary reaction, say a+b → c+d. In one process, a+b fuse to form a metastable state (a resonance) which, after a characteristic lifetime, decays into c+d. In the other process the pair a+c exchanges a virtual particle with the pair b+d. In QED these two processes have to be added because they correspond to two distinct Feynman diagrams, while, according to DHS duality, each one provides, for strong interactions, the whole story. I’d heard about DHS duality from Murray Gell-Mann at the Erice summer school in 1967, where he said that DHS would lead to a “cheap bootstrap” for the strong interaction. Hearing this being said by a great physicist motivated me enormously. I was in the middle of my PhD studies at the Weizmann Institute in Israel. Back there in the fall, a collaboration of four people was formed. It consisted of Marco Ademollo, on leave at Harvard from Florence, and of Hector Rubinstein, Miguel Virasoro and myself at the Weizmann Institute. We worked intensively for a period of eight-to-nine months trying to solve the (apparently not so) cheap bootstrap for a particularly convenient reaction. We got very encouraging results hinting, I felt, at the existence of a simple exact solution. That solution turned out to be the Euler beta function.
Indeed. The preparatory work done by the four of us had a crucial role, but the discovery that the Euler beta function was an exact realisation of DHS duality was just my own. It was around mid-June 1968, just days before I had to take a boat from Haifa to Venice and then continue to CERN where I would spend the month of July. By that time the group of four was already dispersing (Rubinstein on his way to NYU, Virasoro to Madison, Wisconsin via Argentina, Ademollo back to Florence before a second year at Harvard). I kept working on it by myself, first on the boat, then at CERN until the end of July when, encouraged by Sergio Fubini, I decided to send the preprint to the journal Il Nuovo Cimento.
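For reference, the amplitude in question – written here in its standard textbook form, with $\alpha(x)$ a linear Regge trajectory – is the Euler beta function of the two Regge trajectories:

```latex
A(s,t) \;=\; B\bigl(-\alpha(s),\,-\alpha(t)\bigr)
       \;=\; \frac{\Gamma(-\alpha(s))\,\Gamma(-\alpha(t))}{\Gamma\bigl(-\alpha(s)-\alpha(t)\bigr)},
\qquad
\alpha(x) \;=\; \alpha(0) + \alpha' x .
```

The symmetry $B(x,y)=B(y,x)$ and the fact that the $\Gamma$ functions produce poles in both the $s$- and $t$-channels make the DHS duality of the two descriptions manifest in a single function.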
Was the significance of the result already clear?
Well, the formula had many desirable features, but the reaction of the physics community came to me as a shock. As soon as I had submitted the paper I went on vacation for about four weeks in Italy and did not think much about it. At the end of August 1968, I attended the Vienna conference – one of the biennial Rochester-conference series – and found out, to my surprise, that the paper was already widely known and got mentioned in several summary talks. I had sent the preprint as a contribution and was invited to give a parallel-session talk about it. Curiously, I have no recollection of that event, but my wife remembers me telling her about it. There was even a witness, the late David Olive, who wrote that listening to my talk changed his life. It was an instant hit, because the model answered several questions at once, but it was not at all apparent then that it had anything to do with strings, not to mention quantum gravity.
When was the link to “string theory” made?
The first hints that a physical model for hadrons could underlie my mathematical proposal came after the latter had been properly generalised (to processes involving an arbitrary number of colliding particles) and the whole spectrum of hadrons it implied was unravelled (by Fubini and myself and, independently, by Korkut Bardakci and Stanley Mandelstam). It came out, surprisingly, to closely resemble the exponentially growing (with mass) spectrum postulated almost a decade earlier by CERN theorist Rolf Hagedorn and, at least naively, it implied an absolute upper limit on temperature (the so-called Hagedorn temperature).
The spectrum coincides with that of an infinite set of harmonic oscillators and thus resembles the spectrum of a quantised vibrating string with its infinite number of higher harmonics. Holger Nielsen and Lenny Susskind independently suggested a string (or a rubber-band) picture. But, as usual, the devil was in the details. Around the end of the decade Yoichiro Nambu (and independently Goto) gave the first correct definition of a classical relativistic string, but it took until 1973 for Goddard, Goldstone, Rebbi and Thorn to prove that the correct application of quantum mechanics to the Nambu–Goto string reproduced exactly the above-mentioned generalisation of my original work. This also included certain consistency conditions that had already been found, most notably the existence of a massless spin-1 state (by Virasoro) and the need for extra spatial dimensions (from Lovelace’s work). At that point it became clear that the original model had a clear physical interpretation of hadrons being quantised strings. Some details were obviously wrong: one of the most striking features of strong interactions is their short-range nature, while a massless state produces long-range interactions. The model being inconsistent for three spatial dimensions (our world!) was also embarrassing, but people kept hoping.
So string theory was discovered by accident?
Not really. Qualitatively speaking, however, having found that hadrons are strings was no small achievement for those days. It was not precisely the string we now associate with quark confinement in QCD. Indeed the latter is so complicated that only the most powerful computers could shed some light on it many decades later. A posteriori, the fact that by looking at hadronic phenomena we were driven into discovering string theory was neither a coincidence nor an accident.
When was it clear that strings offer a consistent quantum-gravity theory?
This very bold idea came as early as 1974 from a paper by Joel Scherk and John Schwarz. Confronted with the fact that the massless spin-1 string state refused to become massive (there is no Brout–Englert–Higgs mechanism at hand in string theory!) and that even a massless spin-2 string had to be part of the string spectrum, they argued that those states should be identified with the photon and the graviton, i.e. with the carriers of electromagnetic and gravitational interactions, respectively. Other spin-1 particles could be associated with the gluons of QCD or with the W and Z bosons of the weak interaction. String theory would then become a theory of all interactions, at a deeper, more microscopic level. The characteristic scale of the hadronic string (~10⁻¹³ cm) had to be reduced by 20 orders of magnitude (~10⁻³³ cm, the famous Planck-length) to describe the quarks themselves, the electron, the muon and the neutrinos, in fact every elementary particle, as a string.
In addition, it turned out that a serious shortcoming of the old string (namely its “softness”, meaning that string–string collisions cannot produce events with large deflection angles) was a big plus for the Scherk–Schwarz proposal. While the data were showing that hard hadron collisions were occurring at substantial rates, in agreement with QCD predictions, the softness of string theory could free quantum gravity from its problematic ultraviolet divergences – the main obstacle to formulating a consistent quantum-gravity theory.
Did you then divert your attention to string theory?
Not immediately. I was still interested in understanding the strong interactions and worked on several aspects of perturbative and non-perturbative QCD and their supersymmetric generalisations. Most people stayed away from string theory during the 1974–1984 decade. Remember that the Standard Model had just come to life and there was so much to do in order to extract its predictions and test it. I returned to string theory after the Green–Schwarz revolution in 1984. They had discovered a way to reconcile string theory with another fact of nature: the parity violation of weak interactions. This breakthrough put string theory in the spotlight again and since then the number of string-theory aficionados has been steadily growing, particularly within the younger part of the theory community. Several revolutions have followed since then, associated with the names of Witten, Polchinski, Maldacena and many others. It would take too long to do justice to all these beautiful developments. Personally, and very early on, I got interested in applying the new string theory to primordial cosmology.
Was your 1991 paper the first to link string theory with cosmology?
I think there was at least one already, a model by Brandenberger and Vafa trying to explain why our universe has only three large spatial dimensions, but it was certainly among the very first. In 1991, I (and independently Arkadi Tseytlin) realised that the string-cosmology equations, unlike Einstein’s, admit a symmetry (also called, alas, duality!) that connects a decelerating expansion to an accelerating one. That, I thought, could be a natural way to get an inflationary cosmology, which was already known since the 1980s, in string theory without invoking an ad-hoc “inflaton” particle.
The problem was that the decelerating solution had, superficially, a Big Bang singularity in its past, while the (dual) accelerating solution had a singularity in the future. But this was only the case if one neglected effects related to the finite size of the string. Many hints, including the already mentioned upper limit on temperature, suggested that Big Bang-like singularities are not really there in string theory. If so, the two duality-related solutions could be smoothly connected to provide what I dubbed a “pre-Big Bang scenario” characterised by the lack of a beginning of time. I think that the model (further developed with Maurizio Gasperini and by many others) is still alive, at least as long as a primordial B-mode polarisation is not discovered in the cosmic microwave background, since it is predicted to be insignificant in this cosmology.
Did you study other aspects of the new incarnation of string theory?
A second line of string-related research, which I have followed since 1987, concerns the study of thought experiments to understand what string theory can teach us about quantum gravity in the spirit of what people did in the early days of quantum mechanics. In particular, with Daniele Amati and Marcello Ciafaloni first, and then also with many others, I have studied string collisions at trans-Planckian energies (>10¹⁹ GeV) that cannot be reached in human-made accelerators but could have existed in the early universe. I am still working on it. One outcome of that study, which became quite popular, is a generalisation of Heisenberg’s uncertainty principle implying a minimal value of Δx of the order of the string size.
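In its simplest form – sketched here in natural units (ħ = c = 1), with numerical factors of order one omitted – this stringy generalisation of the uncertainty principle reads:

```latex
\Delta x \;\gtrsim\; \frac{1}{\Delta p} \;+\; \alpha'\,\Delta p
\qquad\Longrightarrow\qquad
\Delta x_{\min} \;\sim\; \sqrt{\alpha'} \;\sim\; \ell_s ,
```

where $\alpha'$ is the Regge slope and $\ell_s$ the string length. The first term is Heisenberg’s; the second grows with $\Delta p$ because more energetic probes are longer strings, so localisation can never be pushed below the string scale.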
50 years on, is the theory any closer to describing reality?
People say that string theory doesn’t make predictions, but that’s simply not true. It predicts the dimensionality of space, which is the only theory so far to do so, and it also predicts, at tree level (the lowest level of approximation for a quantum-relativistic theory), a whole lot of massless scalars that threaten the equivalence principle (the universality of free-fall), which is by now very well tested. If we could trust this tree-level prediction, string theory would be already falsified. But the same would be true of QCD, since at tree level it implies the existence of free quarks. In other words: the new string theory, just like the old one, can be falsified by large-distance experiments provided we can trust the level of approximation at which it is solved. On the other hand, in order to test string theory at short distance, the best way is through cosmology. Around (i.e. at, before, or soon after) the Big Bang, string theory may have left imprints on the early universe, and the subsequent expansion can bring those imprints to macroscopic scales today.
What do you make of the ongoing debate on the scientific viability of the landscape, or “swamp”, of string-theory solutions?
I am not an expert on this subject but I recently heard (at the Strings 2018 conference in Okinawa, Japan) a talk on the subject by Cumrun Vafa claiming that the KKLT solution [which seeks to account for the anomalously small value of the vacuum energy, as proposed in 2003 by Kallosh, Kachru, Linde and Trivedi] is in the swampland, meaning it’s not viable at a fundamental quantum-gravity level. It was followed by a heated discussion and I cannot judge who is right. I can only add that the absence of a metastable de-Sitter vacuum would favour quintessence models of the kind I investigated with Thibault Damour several years ago and that could imply interestingly small (but perhaps detectable) violations of the equivalence principle.
What’s the perception of strings from outside the community?
Some of the popular coverage of string theory in recent years has been rather superficial. When people say string theory can’t be proved, it is unfair. The usual argument is that you need inconceivably high energies. But, as I have already said, the new incarnation of string theory can be falsified just like its predecessor was; it soon became very clear that QCD was a better theory. Perhaps the same will happen to today’s string theory, but I don’t think there are serious alternatives at the moment. Clearly the enthusiasm of young people is still there. The field is atypically young – the average age of attendees of a string-theory conference is much lower than that for, say, a QCD or electroweak physics conference. What is motivating young theorists? Perhaps the mathematical beauty of string theory, or perhaps the possibility of carrying out many different calculations, publishing them and getting lots of citations.
What advice do you offer young theorists entering the field?
I myself regret that most young string theorists do not address the outstanding physics questions with quantum gravity, such as what’s the fate of the initial singularity of classical cosmology in string theory. These are very hard problems and young people these days cannot afford to spend a couple of years on one such problem without getting out a few papers. When I was young I didn’t care about fashions, I just followed my nose and took risks that eventually paid off. Today it is much harder to do so.
How has theoretical particle physics changed since 1968?
In 1968 we had a lot of data to explain and no good theory for the weak and strong interactions. There was a lot to do and within a few years the Standard Model was built. Today we still have essentially the same Standard Model and we are still waiting for some crisis to come out of the beautiful experiments at CERN and elsewhere. Steven Weinberg used to say that physics thrives on crises. The crises today are more in the domain of cosmology (dark matter, dark energy), the quantum mechanics of black holes and really unifying our understanding of physics at all scales, from the Planck length to our cosmological horizon, two scales that are 60 orders of magnitude apart. Understanding such a hierarchy (together with the much smaller one of the Standard Model) represents, in my opinion, the biggest theoretical challenge for 21st century physics.