
Success in scientific management

Barry Barish

Your co-Nobelists in the discovery of gravitational waves, Kip Thorne and Rainer Weiss, have both recognised your special skills in the management of the LIGO collaboration. When you landed in LIGO in 1994, what was the first thing you changed?

When I arrived in LIGO, there was a lot of dysfunction and people were going after each other. So, the first difficult problem was to make LIGO smaller, not bigger, by moving out people who weren’t going to be able to contribute constructively in the longer term. Then I started to address what I felt were the technical and management weaknesses. Together with my colleague Gary Sanders, who had worked with me on one of the would-be detectors for the Superconducting Super Collider (SSC) before the project was cancelled, I started looking for the kind of people who were missing in technical areas.

For example, LIGO relies on very advanced lasers, but I was convinced that the laser being planned for, a gas laser, was not the best choice: lasers were, and still are, a very fast-moving technology, and solid-state lasers were more forward-looking. Coming from particle physics, I’m used to not seeing a beam with my own eyes, so I wasn’t disturbed that the most promising lasers at that time emitted light in the infrared instead of green, and that the technology had advanced to the point where they could be built in industry. People who worked with interferometers were used to “little optics” on lab benches, where the lasers were all green and the alignment of mirrors and so on was straightforward. I asked three of the most advanced groups in the world working on lasers of the type we needed (Hannover in Germany, Adelaide in Australia and Stanford in California) if they’d like to work with us, and we brought these experts into LIGO to form the core of what we still have today as our laser group.

Project management for forefront science experiments is very different, and it is hard for people to do it well

This story is mirrored in many of the different technical areas in LIGO. Physics expertise and expertise in the use of interferometer techniques were in good supply in LIGO, so the main challenge was to find expertise to develop the difficult forefront technologies that we were going to depend on to reach our ambitious sensitivity goals. We also needed to strengthen the engineering and project-management areas, but that just required recruiting very good people. Later, the collaboration grew a lot, but mostly on the data-analysis side, which today makes up much of our collaboration.

According to Gary Sanders of SLAC, “efficient management of large science facilities requires experience and skills not usually found in the repertoire of research scientists”. Are you a rare exception?

Gary Sanders was a student of Sam Ting, and then went to Los Alamos, where he got a lot of good experience doing project work. For myself, I learned what was needed organically, as my own research grew into larger and larger projects. Maybe my personality matched the problem, but I also studied the subject. I know how engineers go about building a bridge, for example, and I could pass an exam in project management. But project management for forefront science experiments is very different, and it is hard for people to do it well. If you build a bridge, you have a boss, who has three or four people doing tasks under their supervision; generally, a large engineering project is structured as a big hierarchical organisation. Doing a physics research project is almost the opposite. Once you’ve built a bridge, it’s a bridge, and you don’t change it. When you build a physics experiment, it usually doesn’t do what you want it to do. You begin with one plan and then decide to change to another, or even while you’re building it you develop better approaches and technologies that will improve the instruments. To do research in physics, experience tells us that we need a flat, rather than vertical, organisational style.

So, you can’t build a complicated, expensive, ever-evolving research project using just what’s taught in the project-management books, and you can’t meet cost, schedule and performance goals in the style of a typical physics-department research group. You have to employ some sort of hybrid. Whether it’s LIGO or an LHC experiment, you need enough discipline to make sure things are done on time, yet you also need the flexibility and encouragement to change things for the better. In LIGO, we judiciously adapted various project-management formalities, applying them while interfering no more than necessary with what we do in a research environment. Then the only problem – but admittedly a big one – is to get the researchers, who don’t like any structure, to buy into this approach.

How did your SSC experience help?

It helped with the political part, not the technical part, because I came to realise how difficult the politics and the things outside a project are. Almost everything I had worked on before was very hard, because of what it was or because of some politics in doing it, but I never had enormous problems that were totally outside my control, as we had in the SSC.

How did you convince the US government to keep funding LIGO, which has been described as the most costly project in the history of the NSF?

It’s a miracle, because not only was LIGO costly, but we didn’t have much to show in terms of science for more than 20 years: we were funded in 1994, and we made the first detection more than 20 years later. I think the miracle wasn’t me; rather, we were in a unique situation in the US. Our funding agency, the NSF, has a different mission from any other agency I know. In the US, the physical sciences are funded by three big agencies. One is the DOE, which has a division that does research in various areas, with national labs that have their own structures and missions. The other big agency that does physical science is NASA, which has the added challenge of safety in space. The NSF gets less money than the other two, but it has a mission that I would characterise by one word: science. LIGO has so far seen five different NSF directors, all of them prominent scientists. Having the director of the funding agency be someone who understood the potential importance of gravitational waves, maybe not in detail, helped make the NSF decide both to take such a big risk on LIGO and then to continue supporting it until it succeeded. The NSF leadership understands that risk-taking is integral to making big advancements in science.

What was your role in LIGO apart from management?

I concentrated more on the technical side in LIGO than on data analysis. In LIGO, the analysis challenges are more theoretical than they are in particle physics: we have to compare general relativity with what happens in a real physical phenomenon that produces gravitational waves. That is a mixed problem of developing numerical relativity as well as sophisticated data-analysis pipelines. Another challenge is the huge amount of data because, unlike at CERN, there are no triggers; we just take data all the time, so sorting through it is the analysis problem. Nevertheless, I’ve always felt, and still feel, that the real challenge for LIGO is that we are limited by how sensitive we can make the detector, not by how well we can do the data analysis.

What are you doing now in LIGO?

Now that I can do anything I want, I am focusing on something I am interested in and that we don’t employ very much: artificial intelligence and machine learning (ML). In LIGO there are several problems that lend themselves very well to ML, given recent advances. So we built a small group of people, mostly much younger than me, to do ML in LIGO. I recently started teaching at the University of California Riverside, and have begun working with young faculty in the university’s computer-science department on adapting ML techniques to problems in physics. In LIGO, we have a problem in the data that we call “glitches”, which arise when something happening in the apparatus or the outside world shows up in the data. We need to get rid of glitches, and we currently use a lot of human effort to make the data clean. This is a problem that should lend itself very well to an ML analysis.
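As a purely schematic illustration of why glitch identification lends itself to ML (this is not LIGO’s actual pipeline: the synthetic “spectral features”, the excess-power signature and the nearest-centroid rule below are all invented for the sketch), the idea is that, once labelled examples of clean and glitchy data segments exist, even a very simple classifier can learn to separate them automatically:

```python
# Illustrative sketch only: real glitch classification uses far richer data
# (e.g. time-frequency spectrograms) and far more capable models.
import numpy as np

rng = np.random.default_rng(0)

def make_segments(n, glitch):
    """Synthetic 8-bin spectral feature vectors for n data segments."""
    x = rng.normal(0.0, 1.0, size=(n, 8))
    if glitch:
        x[:, 3] += 5.0  # hypothetical narrow-band power excess marking a glitch
    return x

# "Training" data: labelled clean and glitchy segments.
clean, glitchy = make_segments(200, False), make_segments(200, True)
centroids = np.stack([clean.mean(axis=0), glitchy.mean(axis=0)])  # class prototypes

def classify(x):
    """Assign each segment to the nearer class centroid (0 = clean, 1 = glitch)."""
    d = np.linalg.norm(x[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1)

# Held-out test segments, half clean and half glitchy.
test = np.vstack([make_segments(50, False), make_segments(50, True)])
labels = np.array([0] * 50 + [1] * 50)
accuracy = (classify(test) == labels).mean()
```

The point of the sketch is only that labelled data plus a learned decision rule replaces manual vetting; scaling this to real interferometer data is where the research effort lies.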

Now that gravitational waves have joined the era of multi-messenger astronomy, what’s the most exciting thing that can happen next?

For gravitational waves, knowing what discovery you are going to make is almost impossible, because they are a totally new probe of the universe. Nevertheless, there are some known sources that we should be able to see soon, maybe even in the present run. So far we’ve seen two sources of gravitational waves: the collision of two black holes and the collision of two neutron stars, but we haven’t yet seen a black hole with a neutron star orbiting it. Such systems are particularly interesting scientifically because they contain information about the nuclear physics of very compact objects, and because the two objects are very different in mass, which is very difficult to calculate using numerical relativity. So it’s not just ticking off another source; it opens new areas of gravitational-wave science. Another attractive possibility is to detect a spinning neutron star, a pulsar; this continuous signal is another interesting source that we hope to detect before long. Actually, I’m more interested in seeing unanticipated sources, where we have no idea what we’re going to see – perhaps phenomena that happen uniquely in gravity alone.

The NSF leadership understands that risk-taking is integral to making big advancements

Will we ever see gravitons?

That’s a really good question because gravitons don’t exist in Einstein’s equations. But that’s not necessarily nature, that’s Einstein’s equations! The biggest problem we have in physics is that we have two fantastic theories. One describes almost anything you can imagine on a large scale, and that’s Einstein’s equations, and the other, which describes almost too well everything you find here at CERN, is the Standard Model, which is based on quantum field theory. Maybe black holes have the feature that they satisfy Einstein’s equations and at the same time conserve quantum numbers and all the things that happen in quantum physics. What we are missing is the experimental clue, whether it’s gravitons or something else that needs to be explained by both these theories. Because theory alone has not been able to bring them together, I think we need experimental information.

Do particle accelerators still have a role in this?

We never know, because we don’t know the future, but our best way of probing the limits of our present understanding has been traditional particle accelerators, because we have the most control over the particles we’re studying. The unique feature of particle accelerators is the ability to measure all the particle parameters we want. We’ve found the Higgs boson, and that’s wonderful, but now we know that neutrinos also have mass, and the Higgs boson possibly doesn’t account for that. We have three families of particles, and a whole set of other very fundamental questions on which we have no handle at all, despite the fact that we have this nice “standard” model. So is there a good reason to go to higher energy or to a different kind of accelerator? Absolutely, though it’s a practical question whether it’s doable and affordable.

What’s the current status of gravitational-wave observatories?

We will continue to improve the sensitivity of LIGO and Virgo in incremental steps over the next few years, and LIGO will add a detector in India to give better global coverage. KAGRA in Japan is also expected to come online. But we can already see that next-generation interferometers will be needed to pursue the science in the future. A good design study, called the Einstein Telescope, has been developed in Europe. In the US we are also looking at next-generation detectors and have different ideas, which is healthy at this point. We are not limited by nature, but by our ability to develop the technologies to make more sensitive interferometers. The next generation of detectors will enable us to reach large redshifts and study gravitational-wave cosmology. We all look forward to exploiting this new area of physics, and I am sure important discoveries will emerge.

David Mark Ritson 1924–2019

David Ritson with Bjørn Wiik

David Mark Ritson, professor emeritus of physics at Stanford University, died peacefully at home on 4 November 2019, just shy of his 95th birthday. He was the last of the leaders of the original seven physics groups formed at SLAC: four of the other leaders were awarded Nobel prizes in physics.

Dave Ritson was born in London and grew up in Hampstead. His ancestors emigrated from Australia, Germany and Lithuania, and his father, a Cambridge alumnus, wrote Helpful Information and Guidance for Every Refugee, distributed in the 1930s and 1940s. Dave won scholarships to Merchant Taylors’ School and to Christ Church, Oxford. His 1948 PhD work included deploying the first high-sensitivity emulsion at the Jungfraujoch research station, and then developing it. Within the data were two particle-physics icons: the whole π → μ → e sequence, and τ-meson decay.

Dave moved to the Dublin IAS, to Rochester and to MIT, doing experiments which helped prove that the s-quark exists. His results were among many that underpinned the “τ–θ puzzle”, solved by the discovery of parity violation in beta and muon decay. Dave also assisted accelerator physicist Ken Robinson with the proof that stable storage of an electron beam in a synchrotron was possible. In 1961 he and Ferdinando Amman published the equation for disruption caused by colliding e+e− beams. “Low beta” collider interaction regions are based on the Amman–Ritson equation.

Dave edited the book Techniques of High Energy Physics, published in 1961, and then took a faculty position in the Stanford physics department – bringing British acuity and economy to the ambitious SLAC team. Between 1964 and 1969, he and Burt Richter submitted four proposals to the US Atomic Energy Commission (AEC) for an e+e− collider, all of which were rejected. Dave designed the 1.6 GeV spectrometer in End Station A to detect proton recoils, which were used to reconstruct “missing mass” and to measure the photoproduction of hard-to-detect bosons.

After 1969 Dave founded Fermilab E-96, the Single Arm Spectrometer Facility, obtaining contributions from many institutions, including Argonne, CERN, Cornell, INFN Bari, MIT and SLAC. It was unusual for accelerator labs to support the fabrication of experiments at other labs’ facilities. Meanwhile, SLAC found internal funding for the SPEAR e+e− collider, a stripped-down version of the last proposal rejected by the AEC; led by Richter, it drove the epic 1974 c-quark discovery.

Dave returned to SLAC and in 1976 led the formation of the MAC collaboration for SLAC’s new PEP e+e− collider. The MAC design of near-hermetic calorimetry with central and toroidal outer spectrometers is now classic. Bill Ford from Colorado used MAC to make the first observation of the long b-quark lifetime. In 1983 Dave led the close-in tracker (vertex detector) project, with the first layer only 4.6 cm from the e+e− beams, and confirmed the long b-quark lifetime with reduced errors.

He formally retired in 1987 but was active until 2003 in accelerator design at SLAC, CERN, Fermilab and for the SSC. He helped guide the SLC beams through their non-planar path into collision, and wrote several articles for Nature. He also contributed to the United Nations’ Intergovernmental Panel on Climate Change.

Dave was intensely devoted to his wife Edda, from Marsala, Sicily, who died in 2004, and is survived by their five children.

Vladislav Šimák 1934–2019

Vladislav Šimák

Experimental particle physicist and founder of antiproton physics in Czechoslovakia (later the Czech Republic), Vladislav Šimák passed away on 26 June 2019. From the early 1960s his vision and organisational skills helped shape experimental particle physics, not only in Prague but throughout the country.

After graduating from Charles University in Prague, he joined the group at the Institute of Physics of the Czechoslovak Academy of Sciences studying cosmic rays using emulsion techniques, earning a PhD in 1963. Though it was difficult to travel abroad at that time, Vlada got a scholarship and went to CERN, where he joined the group led by Bernard French investigating collisions of antiprotons using bubble chambers. It was there and then that his lifelong love affair with antiprotons began. He brought back to Prague film material showing collisions of 5.7 GeV antiprotons with protons from a hydrogen bubble chamber, and formed a group of physicists and technicians, involving many diploma and PhD students, who processed it. Vlada also fell in love with the idea of quarks, as proposed by Gell-Mann and Zweig, and was the first Czech or Slovak physicist to apply a quark model to pion production in proton–antiproton collisions.

In the early 1970s, when contacts with the West were severely limited, Vlada exploited the experience he had accumulated at CERN and put together a group of Czech and Slovak physicists to process and analyse data from proton–antiproton collisions, using the then-highest-energy beam of antiprotons (22.4 GeV) and a hydrogen bubble chamber at the Serpukhov accelerator in Russia. This experiment, which at a later stage provided collisions of antideuterons with protons and deuterons, gave many young physicists the chance to work on unique data for their PhDs, and earned Vlada respect in the international community.

After the Velvet Revolution he played a pivotal role in accession to CERN membership

In the late 1980s, when the political atmosphere in Czechoslovakia eased, Vlada together with his PhD student joined the UA2 experiment at CERN’s proton–antiproton collider, where he devoted his attention to jet production. After the Velvet Revolution in November 1989 he played a pivotal role in the decision of the Czech and Slovak particle-physics community to focus on accession to CERN membership.

In 1992 Vlada took Czechoslovak particle physicists into the newly formed ATLAS collaboration, and in 1997 he joined the D0 experiment at Fermilab. He was active in ATLAS until very recently, and in 2014 the Czech Academy of Sciences awarded him the Ernst Mach Medal in acknowledgment of his contributions to the development of physics.

Throughout his life he combined his passion for physics with a love for music, for many years playing the violin in the Academy Chamber Orchestra. For many of us Vlada was a mentor, colleague and friend. We all admired his vitality and enthusiasm for physics, which was contagious. Vlada clearly enjoyed life and we very much enjoyed his company.

He will be sorely missed.

A recipe for sustainable particle physics

The SESAME light source

There has been a marked increase in awareness of climate change in society. Whether due to the school strikes initiated by Greta Thunberg or the destructive bushfires gripping Australia, the climate emergency has moved up the public’s list of concerns. Governments around the world have put in place targets to reduce greenhouse-gas emissions under the 2015 Paris agreement, informed by the assessments of the Intergovernmental Panel on Climate Change (IPCC). The scientific community, like others, will increasingly be expected to put in place measures to reduce its greenhouse-gas emissions. It is therefore timely to create structures that will minimise the carbon footprint of current and future experiments, and of their researchers.

The LHC uses 1.25 TWh of electricity annually, the equivalent of powering around 300,000 homes, or roughly 2% of the annual consumption of Switzerland. Fortunately, the electricity supply of the LHC comes from France, where only about 10% of electricity is produced by fossil fuels. CERN is adopting several green initiatives. For example, it recently released plans to use hot water from a cooling plant at Point 8 of the LHC (where the LHCb detector is situated) to heat 8000 homes in the nearby town of Ferney-Voltaire. In 2015, CERN introduced an energy-management panel and the laboratory is about to publish a wide-ranging environmental report. CERN is also involved in the biennial workshop series Energy for Sustainable Science at Research Infrastructures, which started in 2011 and is where useful ideas are shared among research infrastructures. Whether it be related to high-performance computing or the LHC’s cryogenic systems, increased energy efficiency both reduces CERN’s carbon footprint and provides financial savings.

It is a moral imperative for the community to look at ways to reduce its carbon footprint

In addition to colliders, particle physics also involves detectors, some of which need particular gases for their operation or cooling. Unfortunately, some of these gases have a very high global-warming potential. For example, sulphur hexafluoride, which is commonly used in high-voltage supplies and also in certain detectors, such as the resistive plate chambers in the ATLAS muon spectrometer, causes 16,000 times more warming than CO2 over a 20-year period. Though mostly used in closed circuits, some of these gases are occasionally vented to the atmosphere or leak from detectors. Although the quantities involved are small, some of the gases used by current detectors are likely to be banned by many countries soon, making them very hard to procure and their prices volatile. A lot is already being done to combat this issue: at CERN, for instance, huge efforts have gone into replacing detector cooling fluids and investigating new gas mixtures.

Strategic approach

The European particle-physics community is currently completing the update of its strategy for the next five years or so, which will guide not only CERN’s activities but also those in all European countries. It is of the utmost importance that sustainability goals be included in this strategy. To this end, my colleagues Cham Ghag and David Waters (University College London) and Francesco Spano (Royal Holloway) and I arrived at three main recommendations on sustainability as input to the strategy process.

Véronique Boisvert

First, as part of their grant-giving process, European laboratories and funding agencies should include criteria evaluating the energy efficiency and carbon footprint of particle-physics proposals, and should expect to see evidence that energy consumption has been properly estimated and minimised. Second, any design for a major experiment should include plans for reducing energy consumption, increasing energy efficiency, recovering energy and offsetting carbon. (Similarly, any design for new buildings should meet the highest energy-efficiency standards.) Third, European laboratories should invest in next-generation digital meeting spaces, including virtual-reality tools, to minimise the need for frequent travel. Many environmental groups are calling for a frequent-flyer levy, since roughly 15% of the population take about 70% of all flights. This could have a massive effect on the travel budgets of particle physicists, but it is a moral imperative for the community to look at ways to reduce this carbon footprint. Another area that the IPCC has identified as needing massive change is food. Particle physicists could send a very powerful message by making their work-related catering mostly vegetarian.

Particle physics is flush with ideas for future accelerators and technologies to probe deeper into the structure of matter. CERN and particle physicists are important role models for the world’s scientific community. Channelling some of our scientific creativity into addressing the sustainability of our own field, or even into finding solutions for climate change, will produce ripples across all of society.

Japanese scientists identify priorities

Illustration of the proposed International Linear Collider.

The International Linear Collider (ILC), which Japan is considering hosting in the Tohoku region, has not been selected as a high-priority project in the country’s 2020 “master plan” for large research projects. The master plan, which is compiled every three years, was announced on 30 January by the Science Council of Japan (SCJ). Among the 31 projects that did make it onto the high-priority list were the Super-B factory at KEK, the KAGRA gravitational-wave laboratory and an upgrade of the J-PARC facility.

“Even though the ILC did not go into the final shortlist, it was selected as one of the projects that went to the hearing stage indicating that the scientific merit of the ILC was recognized by the committee,” said ILC director Shin Michizono. “This allows the ILC project to move to the next phase.”

In 2012, physicists in Japan submitted a petition to the Japanese government to host the ILC, an electron–positron collider serving as a Higgs factory. A technical design report was published the following year and, in 2017, the original ILC design was revised to reduce its centre-of-mass energy by half (to 250 GeV), shortening the machine by around a third. In 2018, the International Committee for Future Accelerators (ICFA) issued a statement of support for the project, but in March last year Japan’s Ministry of Education, Culture, Sports, Science and Technology (MEXT) announced that it had “not yet reached declaration” for hosting the ILC and that the project “requires further discussion in formal academic decision-making processes such as the SCJ master plan”.

The important thing is that discussions on how to share the burden start soon.

Lyn Evans

At a press conference held on 31 January, the state minister for MEXT, Koichi Hagiuda, responded positively to the contents of the SCJ document. “This has been put together from the viewpoint of people representing the academic community, and we believe that it will serve as a reference for future discussions within the government. Being an international project, the ILC project requires broad support from both inside and outside the country. In light of the outcome of the Master Plan 2020, and observing the progress of other discussions such as the European Strategy for Particle Physics, we would like to carefully carry forward the discussions.”

Naokazu Takemoto, minister of state for science and technology policy in the Japanese government’s cabinet office, said: “To put it simply, the project made it through the first round of evaluations, and there were about 60 such projects. In the second round, 31 projects were selected, and the ILC was not among them. However, this is a viewpoint of the Science Council. When considering the possibilities going forward, MEXT will look at high-priority research topics, and I hear that the ILC will be included in the list of these topics.” Responding to a question about the cost of the ILC, Takemoto continued: “The cost is to be shared among many countries, but some say that Japan needs to shoulder most of it. Even if these are the presumptions, I personally think we should strongly ask for realizing the project. It will effectively contribute to regional revitalisation. It will give back hope to people who have suffered greatly by the [damage caused by a tsunami in 2011]. Furthermore, it will give Japan’s technology an advantage to have an important share in the area of the world’s scientific research.”

MEXT representatives are expected to update the community on 20 February during the 85th meeting of ICFA at SLAC National Laboratory in the US.

“It is no surprise that the ILC is not on the SCJ list,” says Lyn Evans, director of the Linear Collider Collaboration. “It is of a different order of magnitude to any other project the committee considered. It also requires broad international collaboration. The important thing is that discussions on how to share the burden start soon.”

Bad Honnef strategy session concludes

Following a week of discussions, the European Strategy Group has released a statement reporting convergence on recommendations to guide the future of high-energy physics in Europe. The 60 or so delegates, among them scientific representatives from each of CERN’s member and associate-member states, directors and representatives of major European laboratories and organisations, and invitees from outside Europe, now return home. Their recommendations will be presented to the CERN Council in March and made public at an event in Budapest, Hungary, on 25 May.

Statement from the European Strategy Group after the Bad Honnef drafting meeting, 25 January

The drafting session of the European Strategy Group preparing the next European Particle Physics Strategy Update took place in Bad Honnef (Germany) between 21 and 25 January 2020. After a week of fruitful discussions involving senior figures of European and international particle physics, convergence was achieved on recommendations that will guide the future of the field.

The drafting session marks a key stage of the strategy update process. The attendees of the Bad Honnef drafting session successfully carried out their ambitious task of identifying a set of priorities and recommendations. They built on the impressive progress made since the last update of the European Strategy for Particle Physics, in 2013, and the rich input received from the entire particle physics community in the current update process.

The next step in this process will be to submit the document outlining the recommendations to the CERN Council. It will be discussed by the Council in March and submitted for final approval at an extraordinary Council Session on 25 May, in Budapest, Hungary. Once approved, it can be made public.

The European Strategy Group

CLIC most flexible option for Europe, study leaders contend

Simulated production of a top-quark pair at a collision energy of 3 TeV at the proposed Compact Linear Collider. Credit: CLIC.

The proposed Compact Linear Collider (CLIC) offers the most flexible option for European particle physics in the post-LHC era, write the leaders of the CLIC study in a preprint posted on arXiv on 15 January. Responding to a preprint by 53 authors in late December which backed a Future Circular Collider (FCC) over CLIC, the CLIC team argues that moving forward quickly with a linear collider “would allow a vibrant high-energy frontier programme to be maintained over the coming decades, while pursuing in parallel the accelerator R&D required to open future options”.

Acknowledging the widespread consensus that the next major collider should be an electron–positron collider to explore the Higgs sector in detail, the authors argue that the discussion of the most appropriate high-energy frontier machine to follow “must be kept open”, so that it can be guided by new physics results and new technology. The three-page note states that an initial CLIC programme undertaken in parallel with strong accelerator R&D and the HL-LHC, followed by the best possible high-energy frontier machine when technologies are mature, “thus provides the most flexible and appealing strategic option” for collider physics in Europe.

Although FCC-ee is unique in offering a very high-statistics Z physics programme, state the CLIC authors, the potential for Higgs-boson studies with a first-stage 380 GeV CLIC or 365 GeV FCC-ee is similar when assuming equivalent running times. They also say that both machines have a similar performance at the top-quark energy and the same accelerator performance risk. “Previous limits of both circular and linear electron–positron colliders have been understood and overcome thanks to vast efforts in hardware developments and large-scale system tests across the planet,” says coauthor Daniel Schulte of CERN. “Both colliders have ambitious parameters, but we are confident that they can be achieved, as confirmed in detailed reviews of both projects.”

Ultimately we all want what is best for our science

Aidan Robson

CLIC studies during the past few years have focused on energy consumption and construction costs, which CLIC project leader Steinar Stapnes of CERN says are now “very favourable” compared with FCC-ee. “Owing to CLIC’s compactness the construction is relatively fast, and we have also deliberately kept the 380 GeV baseline operation time relatively short at eight years,” he says. “We feel strongly that the possibilities for the subsequent step – whether a linear collider energy extension, or a proton or muon collider option – need to be kept on timescales that are not too far away.”

Priorities for European particle physics are under discussion this week at a meeting in Bad Honnef, Germany, as the update of the European strategy for particle physics enters its final stages.

“The European strategy is being developed in a complex environment where particle physics projects continue to become larger and longer-scale,” says Aidan Robson of the University of Glasgow, who is spokesperson of the CLIC detector & physics collaboration. “Ultimately we all want what is best for our science. CLIC at 380 GeV offers a rapid and exciting e+e– programme, and opens doors for R&D for several possible future colliders going much higher in energy. This provides the key elements that offer attractive and challenging opportunities for the young people who will drive the future of our field.”


Strategy drafting under way in Bad Honnef

Today, senior figures in European particle physics have gathered in the small town of Bad Honnef, Germany, for a week of intense discussions that will guide the future of fundamental exploration. The “strategy drafting session” marks the final stage of the update of the European strategy for particle physics. Convened by the European Strategy Group (ESG), which includes a scientific delegate from each of CERN’s member and associate-member states, directors and representatives of major European laboratories and organisations, and invitees from outside Europe, the 60 or so attendees are tasked with identifying a set of priorities and recommendations to the CERN Council.

The ESG, a special body set up by the CERN Council approximately every five years, was invited to formulate an update of the European strategy for particle physics in September 2017. A call for input in 2018 attracted 160 submissions, which were discussed at an open symposium in Granada, Spain, in May 2019. The ESG then published a 200-page briefing book which distilled the input into an objective scientific summary and will form the basis for discussions in Germany this week.

The start of a new project in the early 2040s is crucial to keep the community motivated and engaged

Fabiola Gianotti

The focus of the latest strategy update, the third since 2005, is which major project should follow the LHC once its high-luminosity phase comes to an end in the late 2030s. There is broad support for an electron–positron collider that will explore the Higgs sector in detail, as well as for a high-energy proton–proton collider at CERN. In Europe, the possible options are the Compact Linear Collider and the Future Circular Collider, while an International Linear Collider (ILC) in Japan and a large Circular Electron-Positron Collider in China are also contenders. The strategy update will also consider non-collider experiments, computing, instrumentation and other key aspects of growing importance to the field such as energy efficiency and communication.

The previous strategy update, which concluded in 2013, made several high-priority recommendations: the full exploitation of the LHC, including the high-luminosity upgrade of the machine and detectors; R&D and design studies for a future energy-frontier machine at CERN; establishing a neutrino programme at CERN for physicists to develop detectors for experiments at accelerator-based neutrino facilities around the world; and the welcoming of a proposal from Japan to discuss the possible participation of Europe in the ILC. The first three are well under way, while a decision on the ILC still rests with the Japanese government. Other conclusions of the 2013 update included the need for closer collaboration with the astroparticle and nuclear physics communities, which has been met for example via the recently launched centre for astroparticle physics theory (EuCAPT) and the new Joint ECFA-NuPECC-APPEC Seminar series, JENAS. There was also a call for greater scientific diversity, leading to the CERN-led Physics Beyond Colliders initiative, which will also form a central part of this week’s discussions.

The recommendations from the ESG are due to be formally approved by the CERN Council on 25 May at an event in Budapest, Hungary.

During her annual address to personnel on 14 January, CERN Director-General Fabiola Gianotti acknowledged the enormous efforts that have gone into the strategy update, and said that she hoped that a recommendation on CERN’s next major collider would be among the ESG’s priorities.

“The start of a new project in the early 2040s is crucial to keep the community motivated and engaged,” said Gianotti, noting that CERN and Europe should also be open to participate in projects at the forefront of particle physics elsewhere in the world. “The Higgs boson is a guaranteed deliverable. It is related to the most obscure and problematic sector of the Standard Model and carries special quantum numbers and a new type of interaction. It is therefore a unique door into new physics, and one that can only be studied at colliders.”

Croatia becomes an associate member of CERN

Vesna Batistic Kos and Fabiola Gianotti

On 10 October CERN welcomed the Republic of Croatia as an Associate Member State, following receipt of official notification that Croatia has completed its internal approval procedures in respect of an agreement signed on 28 February.

“It is a great pleasure to welcome Croatia into the CERN family as an associate member. Croatian scientists have made important contributions to a large variety of experiments at CERN for almost four decades, and as an associate member, new opportunities open up for Croatia in scientific collaboration, technological development, education and training,” said CERN Director-General Fabiola Gianotti.

Researchers from Croatia have contributed to many experiments at CERN, and a cooperation agreement concluded in 2001 increased the country’s participation in CERN’s research and educational programmes. As an Associate Member State, Croatia will be represented at the CERN Council and be entitled to attend meetings of the finance committee and the scientific policy committee. Nationals of Croatia will be eligible to apply for limited-duration positions as staff members and fellows, while firms offering goods and services originating from Croatia will be entitled to bid for CERN contracts, creating opportunities for industrial collaboration in advanced technologies.

Croatia joins India, Lithuania, Pakistan, Turkey and Ukraine as Associate Member States, while Cyprus and Slovenia are Associate Member States in the pre-stage to membership.

2019 Nobel Prize in Physics for cosmic perspectives

James Peebles, Michel Mayor and Didier Queloz

The Nobel Prize in Physics for 2019 has recognised two independent bodies of work that have transformed our view of the universe and humanity’s place in it. One half of the SEK 9 million prize, announced on 8 October in Stockholm, was granted to James Peebles of Princeton University for theoretical discoveries in physical cosmology, while the other was shared between Michel Mayor of the University of Geneva and Didier Queloz of the universities of Geneva and Cambridge for the discovery of an exoplanet orbiting a Sun-like star.

Peebles was instrumental in turning cosmology into the precision science it is today, with its ever closer links to collider and particle physics in general. Following the unexpected discovery of the cosmic microwave background (CMB) in 1965, he and others at Princeton used it to support the idea that the universe began in a hot, dense state. While the idea of a “big bang” was already many years old, Peebles paired it with concrete physics processes such as nucleosynthesis and described the role of temperature and density in the formation of structure. With others, he arrived at a model accounting for the density fluctuations in the CMB showing a series of acoustic peaks, which would demonstrate that the universe is geometrically flat and that ordinary matter constitutes just 5% of its total matter and energy content. In the early 1980s, Peebles was the first to consider non-relativistic “cold” dark matter and its effect on structure formation, and he went on to reintroduce Einstein’s forsaken cosmological constant – work that underpins today’s Lambda Cold Dark Matter model of cosmology.

Mayor and Queloz’s discovery of an exoplanet orbiting a solar-type star in the Milky Way opened a new field of study. 51 Pegasi b lies 50 light years from Earth and takes just four days to complete its orbit. It was spotted by tracking how it and its star orbit around their common centre of gravity: a subtle wobbling seen from Earth whose speed can be measured from the starlight via the Doppler effect. The problem is that the radial velocities are extremely low. Mayor mounted his first spectrograph on a telescope at the Haute-Provence Observatory near Marseille in 1977, but it was only sensitive to velocities above 300 m s–1 – too high to see a planet pulling on its star. It took almost two decades of work by him and his group to strike success, with doctoral student Queloz tasked with developing new methods to increase the machine’s light sensitivity. Today, more than 4000 exoplanets with a vast variety of forms, sizes and orbits have been discovered in our galaxy using the radial-velocity method and the newer technique of transit photometry, challenging ideas about planetary formation.
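To get a feel for why these measurements are so demanding, the non-relativistic Doppler relation Δλ/λ ≈ v/c can be evaluated directly. The sketch below is illustrative only: the ~56 m s–1 wobble amplitude of 51 Pegasi and the 550 nm reference wavelength are approximate values assumed for the example, not figures from this article.

```python
# Illustrative sketch: wavelength shift of a spectral line caused by a
# stellar "wobble" of a given radial velocity, using the
# non-relativistic Doppler approximation delta_lambda / lambda = v / c.
# The ~56 m/s amplitude for 51 Pegasi and the 550 nm line are
# assumed example values, not taken from the article.

C = 299_792_458.0  # speed of light in m/s

def doppler_shift(wavelength_nm: float, radial_velocity_ms: float) -> float:
    """Wavelength shift in nm for a source at the given radial velocity."""
    return wavelength_nm * radial_velocity_ms / C

# Shift at the 300 m/s sensitivity limit of the 1977 spectrograph,
# versus the assumed ~56 m/s wobble induced by 51 Pegasi b:
limit_shift = doppler_shift(550.0, 300.0)
wobble_shift = doppler_shift(550.0, 56.0)

print(f"shift at 300 m/s: {limit_shift:.2e} nm")
print(f"shift at  56 m/s: {wobble_shift:.2e} nm")
```

Both shifts are far below a picometre, which is why the discovery hinged on spectrograph stability and calibration rather than raw telescope size.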
