A word with CERN’s next Director-General

Mark Thomson

What motivates you to be CERN’s next Director-General?

CERN is an incredibly important organisation. I believe my deep passion for particle physics, coupled with the experience I have accumulated in recent years, including leading the Deep Underground Neutrino Experiment, DUNE, through a formative phase, and running the Science and Technology Facilities Council in the UK, has equipped me with the right skill set to lead CERN through a particularly important period.

How would you describe your management style?

That’s a good question. My overarching approach is built around delegating and trusting my team. This has two advantages. First, it builds an empowering culture, which in my experience provides the right environment for people to thrive. Second, it frees me up to focus on strategic planning and engagement with numerous key stakeholders. I like to focus on transparency and openness, to build trust both internally and externally.

How will you spend your familiarisation year before you take over in 2026?

First, by getting a deep understanding of CERN “from within”, to plan how I want to approach my mandate. Second, by lending my voice to the scientific discussion that will underpin the third update to the European strategy for particle physics. The European strategy process is a key opportunity for the particle-physics community to provide genuine bottom-up input and shape the future. This is going to be a really varied and exciting year.

What open question in fundamental physics would you most like to see answered in your lifetime?

I am going to have to pick two. I would really like to understand the nature of dark matter. There are a wide range of possibilities, and we are addressing this question from multiple angles; the search for dark matter is an area where the collider and non-collider experiments can both contribute enormously. The second question is the nature of the Higgs field. The Higgs boson is just so different from anything else we’ve ever seen. It’s not just unique – it’s unique and very strange. There are just so many deep questions, such as whether it is fundamental or composite. I am confident that we will make progress in the coming years. I believe the High-Luminosity LHC will be able to make meaningful measurements of the self-coupling at the heart of the Higgs potential. If you’d asked me five years ago whether this was possible, I would have been doubtful. But today I am very optimistic because of the rapid progress with advanced analysis techniques being developed by the brilliant scientists on the LHC experiments.

What areas of R&D are most in need of innovation to meet our science goals?

Artificial intelligence is changing how we look at data in all areas of science. Particle physics is the ideal testing ground for artificial intelligence, because our data is complex and there are none of the issues around the sensitive nature of the data that exist in other fields. Complex multidimensional datasets are where you’ll benefit the most from artificial intelligence. I’m also excited by the emergence of new quantum technologies, which will open up fresh opportunities for our detector systems and also new ways of doing experiments in fundamental physics. We’ve only scratched the surface of what can be achieved with entangled quantum systems.

How about in accelerator R&D?

There are two areas that I would like to highlight: making our current technologies more sustainable, and the development of high-field magnets based on high-temperature superconductivity. This connects to the question of innovation more broadly. To quote one example among many, high-temperature superconducting magnets are likely to be an important component of fusion reactors just as much as particle accelerators, making this a very exciting area where CERN can deploy its engineering expertise and really push that programme forward. That’s not just a benefit for particle physics, but a benefit for wider society.

How has CERN changed since you were a fellow back in 1994?

The biggest change is that the collider experiments are larger and more complex, and the scientific and technical skills required have become more specialised. When I first came to CERN, I worked on the OPAL experiment at LEP – a collaboration of fewer than 400 people. Everybody knew everybody, and it was relatively easy to understand the science of the whole experiment.

But I don’t think the scientific culture of CERN and the particle-physics community has changed much. When I visit CERN and meet with the younger scientists, I see the same levels of excitement and enthusiasm. People are driven by the wonderful mission of discovery. When planning the future, we need to ensure that early-career researchers can see a clear way forward with opportunities in all periods of their career. This is essential for the long-term health of particle physics. Today we have an amazing machine that’s running beautifully: the LHC. I also don’t think it is possible to overstate the excitement of the High-Luminosity LHC. So there’s a clear and exciting future out to the early 2040s for today’s early-career researchers. The question is what happens beyond that? This is one reason to ensure that there is not a large gap between the end of the High-Luminosity LHC and the start of whatever comes next.

Should the world be aligning on a single project?

Given the increasing scale of investment, we do have to focus as a global community, but that doesn’t necessarily mean a single project. We saw something similar about 10 years ago when the global neutrino community decided to focus its efforts on two complementary long-baseline projects, DUNE and Hyper-Kamiokande. From the perspective of today’s European strategy, the Future Circular Collider (FCC) is an extremely appealing project that would map out an exciting future for CERN for many decades. I think we’ll see this come through strongly in an open and science-driven European strategy process.

How do you see the scientific case for the FCC?

For me, there are two key points. First, gaining a deep understanding of the Higgs boson is the natural next step in our field. We have discovered something truly unique, and we should now explore its properties to gain deeper insights into fundamental physics. Scientifically, the FCC provides everything you want from a Higgs factory, both in terms of luminosity and the opportunity to support multiple experiments.

Second, investment in the FCC tunnel will provide a route to hadron–hadron collisions at the 100 TeV scale. I find it difficult to foresee a future where we will not want this capability.

These two aspects make the FCC a very attractive proposition.

How successful do you believe particle physics is in communicating science and societal impacts to the public and to policymakers?

I think we communicate science well. After all, we’ve got a great story. People get the idea that we work to understand the universe at its most basic level. It’s a simple and profound message.

Going beyond the science, the way we communicate the wider industrial and societal impact is probably equally important. Here we also have a good story. In our experiments we are always pushing beyond the limits of current technology, doing things that have not been done before. The technologies we develop to do this almost always find their way back into something that will have wider applications. Of course, when we start, we don’t know what the impact will be. That’s the strength and beauty of pushing the boundaries of technology for science.

Would the FCC give a strong return on investment to the member states?

Absolutely. Part of the return is the science, part is the investment in technology, and we should not underestimate the importance of the training opportunities for young people across Europe. CERN provides such an amazing and inspiring environment for young people. The scale of the FCC will provide a huge number of opportunities for young scientists and engineers.

In terms of technology development, the detectors for the electron–positron collider will provide an opportunity for pushing forward and deploying new, advanced technologies to deliver the precision required for the science programme. In parallel, the development of the magnet technologies for the future hadron collider will be really exciting, particularly the potential use of high-temperature superconductors, as I said before.

It is always difficult to predict the specific “return on investment” of the technologies developed for big scientific research infrastructure. Part of the challenge is that some of the benefits might be 20, 30 or 40 years down the line. Nevertheless, every retrospective study that has tried has demonstrated a huge downstream benefit.

Do we reward technical innovation well enough in high-energy physics?

There needs to be a bit of a culture shift within our community. Engineering and technology innovation are critical to the future of science and critical to the prosperity of Europe. We should be striving to reward individuals working in these areas.

Should the field make it more flexible for physicists and engineers to work in industry and return to the field having worked there?

This is an important question. I actually think things are changing. The fluidity between academia and industry is increasing in both directions. For example, an early-career researcher in particle physics with a background in deep artificial-intelligence techniques is valued incredibly highly by industry. It also works the other way around, and I experienced this myself in my career when one of my post-doctoral researchers joined from an industry background after a PhD in particle physics. The software skills they picked up from industry were incredibly impactful.

I don’t think there is much we need to do to directly increase flexibility – it’s more about culture change, to recognise that fluidity between industry and academia is important and beneficial. Career trajectories are evolving across many sectors. People move around much more than they did in the past.

Does CERN have a future as a global laboratory?

CERN already is a global laboratory. The amazing range of nationalities working here is both inspiring and a huge benefit to CERN.

How can we open up opportunities in low- and middle-income countries?

I am really passionate about the importance of diversity in all its forms and this includes national and regional inclusivity. It is an agenda that I pursued in my last two positions. At the Deep Underground Neutrino Experiment, I was really keen to engage the scientific community from Latin America, and I believe this has been mutually beneficial. At STFC, we used physics as a way to provide opportunities for people across Africa to gain high-tech skills. Going beyond the training, one of the challenges is to ensure that people use these skills in their home nations. Otherwise, you’re not really helping low- and middle-income countries to develop.

What message would you like to leave with readers?

That we have really only just started the LHC programme. With more than a factor of 10 increase in data to come, coupled with new data tools and upgraded detectors, the High-Luminosity LHC represents a major opportunity for a new discovery. Its nature could be a complete surprise. That’s the whole point of exploring the unknown: you don’t know what’s out there. This alone is incredibly exciting, and it is just a part of CERN’s amazing future.

Charm and synthesis

In 1955, after a year of graduate study at Harvard, I joined a group of a dozen or so students committed to studying elementary particle theory. We approached Julian Schwinger, one of the founders of quantum electrodynamics, hoping to become his thesis students – and we all did.

Schwinger lined us up in his office, and spent several hours assigning thesis subjects. It was a remarkable performance. I was the last in line. Having run out of well-defined thesis problems, he explained to me that weak and electromagnetic interactions share two remarkable features: both are vectorial and both display aspects of universality. Schwinger suggested that I create a unified theory of the two interactions – an electroweak synthesis. How I was to do this he did not say, aside from slyly hinting at the Yang–Mills gauge theory.

By the summer of 1958, I had convinced myself that weak and electromagnetic interactions might be described by a badly broken gauge theory, and Schwinger that I deserved a PhD. I had hoped to spend part of a postdoctoral fellowship in Moscow at the invitation of the recent Russian Nobel laureate Igor Tamm, and sought to visit Niels Bohr’s institute in Copenhagen while awaiting my Soviet visa. With Bohr’s enthusiastic consent, I boarded the SS Île de France with my friend Jack Schnepps. Following a memorable and luxurious crossing – one of the great ship’s last – Jack drove south to work with Milla Baldo-Ceolin’s emulsion group in Padova, and I took the slow train north to Copenhagen. Thankfully, my Soviet visa never arrived. I found the SU(2) × U(1) structure of the electroweak model in the spring of 1960 at Bohr’s famous institute on Blegdamsvej, and wrote the paper that would earn my share of the 1979 Nobel Prize.

A year earlier, in 1959, Augusto Gamba, Bob Marshak and Susumu Okubo had proposed lepton–hadron symmetry, which regarded protons, neutrons and lambda hyperons as the building blocks of all hadrons, to match the three known leptons at the time: neutrinos, electrons and muons. The idea was falsified by the discovery of a second neutrino in 1962, and superseded in 1964 by the invention of fractionally charged hadron constituents, first by George Zweig and André Petermann, and then decisively by Murray Gell-Mann with his three flavours of quarks. Later in 1964, while on sabbatical in Copenhagen, James Bjorken and I realised that lepton–hadron symmetry could be revived simply by adding a fourth quark flavour to Gell-Mann’s three. We called the new quark flavour “charm”, completing two weak doublets of quarks to match two weak doublets of leptons, and establishing lepton–quark symmetry, which holds to this day.

Annus mirabilis

1964 was a remarkable year. In addition to the invention of quarks, Nick Samios spotted the triply strange Ω baryon, and Oscar Greenberg devised what became the critical notion of colour. Arno Penzias and Robert Wilson stumbled on the cosmic microwave background radiation. James Cronin, Val Fitch and others discovered CP violation. Robert Brout, François Englert, Peter Higgs and others invented spontaneously broken non-Abelian gauge theories. And to top off the year, Abdus Salam rediscovered and published my SU(2) × U(1) model, after I had more-or-less abandoned electroweak thoughts due to four seemingly intractable problems.

Four intractable problems of early 1964

How could the W and Z bosons acquire masses while leaving the photon massless?

Steven Weinberg, my friend from both high school and college, brilliantly solved this problem in 1967 by subjecting the electroweak gauge group to spontaneous symmetry breaking, initiating the half-century-long search for the Higgs boson. Salam published the same solution in 1968.

How could an electroweak model of leptons be extended to describe the weak interactions of hadrons?

John Iliopoulos, Luciano Maiani and I solved this problem in 1970 by introducing charm and lepton–quark symmetry to avoid unobserved strangeness-changing neutral currents.

Was the spontaneously broken electroweak gauge model mathematically consistent?

Gerard ’t Hooft announced in 1971 that he had proven Steven Weinberg’s electroweak model to be renormalisable. In 1972, Claude Bouchiat, John Iliopoulos and Philippe Meyer demonstrated the electroweak model to be free of Adler anomalies provided that lepton–quark symmetry is maintained.

Could the electroweak model describe CP violation without invoking additional spinless fields?

In 1973, Makoto Kobayashi and Toshihide Maskawa showed that the electroweak model could easily and naturally violate CP if there are more than four quark flavours.

Much to my surprise and delight, all of them would be solved within just a few years, with the last theoretical obstacle removed by Makoto Kobayashi and Toshihide Maskawa in 1973 (see “Four intractable problems” panel). A few months later, Paul Musset announced that CERN’s Gargamelle detector had won the race to detect weak neutral-current interactions, giving the electroweak model the status of a predictive theory. Remarkably, the year had begun with Gell-Mann, Harald Fritzsch and Heinrich Leutwyler proposing QCD, and David Gross, Frank Wilczek and David Politzer showing it to be asymptotically free. The Standard Model of particle physics was born.

Charmed findings

But where were the charmed quarks? Early on Monday morning on 11 November 1974, I was awakened by a phone call from Sam Ting, who asked me to come to his MIT office as soon as possible. He and Ulrich Becker were waiting for me impatiently. They showed me an amazingly sharp resonance. Could it be a vector meson like the ρ or ω and yet be so narrow, or was it something quite different? I hopped in my car and drove to Harvard, where my colleagues Alvaro de Rújula and Howard Georgi excitedly regaled me with the Californian side of the story. A few days later, experimenters in Frascati confirmed the BNL–SLAC discovery, and de Rújula and I submitted our paper “Is Bound Charm Found?” – one of two papers on the J/ψ discovery printed in Physical Review Letters in January 1975 that would prove to be correct. Among the five incorrect papers was one written by my beloved mentor, Julian Schwinger.

Sam Ting at CERN in 1976

The second correct paper was by Tom Appelquist and David Politzer. Well before that November, they had realised (without publishing) that bound states of a charmed quark and its antiquark lying below the charm threshold would be exceptionally narrow due to the asymptotic freedom of QCD. De Rújula suggested to them that such a system be called charmonium, in analogy with positronium. His term made it into the dictionary. Shortly afterward, the 1976 Nobel Prize in Physics was jointly awarded to Burton Richter and Sam Ting for “their pioneering work in the discovery of a heavy elementary particle of a new kind” – evidence that charm was not yet a universally accepted explanation. Over the next few years, experimenters worked hard to confirm the predictions of theorists at Harvard and Cornell by detecting and measuring the masses, spins and transitions among the eight sub-threshold charmonium states. Later on, they would do the same for 14 relatively narrow states of bottomonium.

Abdus Salam, Tom Ball and Paul Musset

Other experimenters were searching for particles containing just one charmed quark or antiquark. In our 1975 paper “Hadron Masses in a Gauge Theory”, de Rújula, Georgi and I included predictions of the masses of several not-yet-discovered charmed mesons and baryons. The first claim to have detected charmed particles was made in 1975 by Robert Palmer and Nick Samios at Brookhaven, again with a bubble-chamber event. It seemed to show a cascade decay process in which one charmed baryon decays into another charmed baryon, which itself decays. The measured masses of both of the charmed baryons were in excellent agreement with our predictions. Though the claim was not widely accepted, I believe to this day that Samios and Palmer were the first to detect charmed particles.

Sheldon Glashow and Steven Weinberg

The SLAC electron–positron collider, operating well above charm threshold, was certainly producing charmed particles copiously. Why were they not being detected? I recall attending a conference in Wisconsin that was largely dedicated to this question. On the flight home, I met my old friend Gerson Goldhaber, who had been struggling unsuccessfully to find them. I think I convinced him to try a bit harder. A couple of weeks later in 1976, Goldhaber and François Pierre succeeded. My role in charm physics had come to a happy ending. 

  • This article is adapted from a presentation given at the Institute of High-Energy Physics in Beijing on 20 October 2024 to celebrate the 50th anniversary of the discovery of the J/ψ.

CLOUD explains Amazon aerosols

In a paper published in the journal Nature, the CLOUD collaboration at CERN has revealed a new source of atmospheric aerosol particles that could help scientists to refine climate models.

Aerosols are microscopic particles suspended in the atmosphere that arise from both natural sources and human activities. They play an important role in Earth’s climate system because they seed clouds and influence their reflectivity and coverage. Most aerosols arise from the spontaneous condensation of molecules that are present in the atmosphere only in minute concentrations. However, the vapours responsible for their formation are not well understood, particularly in the remote upper troposphere.

The CLOUD (Cosmics Leaving Outdoor Droplets) experiment at CERN is designed to investigate the formation and growth of atmospheric aerosol particles in a controlled laboratory environment. CLOUD comprises a 26 m³ ultra-clean chamber and a suite of advanced instruments that continuously analyse its contents. The chamber contains a precisely selected mixture of gases under atmospheric conditions, into which beams of charged pions are fired from CERN’s Proton Synchrotron to mimic the influence of galactic cosmic rays.

“Large concentrations of aerosol particles have been observed high over the Amazon rainforest for the past 20 years, but their source has remained a puzzle until now,” says CLOUD spokesperson Jasper Kirkby. “Our latest study shows that the source is isoprene emitted by the rainforest and lofted in deep convective clouds to high altitudes, where it is oxidised to form highly condensable vapours. Isoprene represents a vast source of biogenic particles in both the present-day and pre-industrial atmospheres that is currently missing in atmospheric chemistry and climate models.”

Isoprene is a hydrocarbon containing five carbon atoms and eight hydrogen atoms. It is emitted by broad-leaved trees and other vegetation and is the most abundant non-methane hydrocarbon released into the atmosphere. Until now, isoprene’s ability to form new particles has been considered negligible.

Seeding clouds

The CLOUD results change this picture. By studying the reaction of hydroxyl radicals with isoprene at upper tropospheric temperatures of –30 °C and –50 °C, the collaboration discovered that isoprene oxidation products form copious particles at ambient isoprene concentrations. This new source of aerosol particles does not require any additional vapours. However, when minute concentrations of sulphuric acid or iodine oxoacids were introduced into the CLOUD chamber, a 100-fold increase in aerosol formation rate was observed. Although sulphuric acid derives mainly from anthropogenic sulphur dioxide emissions, the acid concentrations used in CLOUD can also arise from natural sources.

In addition, the team found that isoprene oxidation products drive rapid growth of particles to sizes at which they can seed clouds and influence the climate – a behaviour that persists in the presence of nitrogen oxides produced by lightning at upper-tropospheric concentrations. After continued growth and descent to lower altitudes, these particles may provide a globally important source for seeding shallow continental and marine clouds, which influence Earth’s radiative balance – the amount of incoming solar radiation compared to outgoing longwave radiation (see “Seeding clouds” figure).

“This new source of biogenic particles in the upper troposphere may impact estimates of Earth’s climate sensitivity, since it implies that more aerosol particles were produced in the pristine pre-industrial atmosphere than previously thought,” adds Kirkby. “However, until our findings have been evaluated in global climate models, it’s not possible to quantify the effect.”

The CLOUD findings are consistent with aircraft observations over the Amazon, as reported in an accompanying paper in the same issue of Nature. Together, the two papers provide a compelling picture of the importance of isoprene-driven aerosol formation and its relevance for the atmosphere.

Since it began operation in 2009, the CLOUD experiment has unearthed several mechanisms by which aerosol particles form and grow in different regions of Earth’s atmosphere. “In addition to helping climate researchers understand the critical role of aerosols in Earth’s climate, the new CLOUD result demonstrates the rich diversity of CERN’s scientific programme and the power of accelerator-based science to address societal challenges,” says CERN Director for Research and Computing, Joachim Mnich.

Emphasising the free circulation of scientists

Physics is a universal language that unites scientists worldwide. No event illustrates this more vividly than the general assembly of the International Union of Pure and Applied Physics (IUPAP). The 33rd assembly convened 100 delegates representing territories around the world in Haikou, China, from 10 to 14 October 2024. Amid today’s polarised global landscape, one clear commitment emerged: to uphold the universality of science and ensure the free movement of scientists.

IUPAP was established in 1922 in the aftermath of World War I to coordinate international efforts in physics. Its logo is recognisable from conferences and proceedings, but its mission is less widely understood. IUPAP is the only worldwide organisation dedicated to the advancement of all fields of physics. Its goals include promoting global development and cooperation in physics by sponsoring international meetings; strengthening physics education, especially in developing countries; increasing diversity and inclusion in physics; advancing the participation and recognition of women and of people from under-represented groups; enhancing the visibility of early-career talents; and promoting international agreements on symbols, units, nomenclature and standards. At the 33rd assembly, 300 physicists were elected to the executive council and specialised commissions for a period of three years.

Global scientific initiatives were highlighted, including the International Year of Quantum Science and Technology (IYQ2025) and the International Decade on Science for Sustainable Development (IDSSD) from 2024 to 2033, which was adopted by the United Nations General Assembly in August 2023. A key session addressed the importance of industry partnerships, with delegates exploring strategies to engage companies in IYQ2025 and IDSSD to further IUPAP’s mission of using physics to drive societal progress. Nobel laureate Giorgio Parisi discussed the role of physics in promoting a sustainable future, and public lectures by fellow laureates Barry Barish, Takaaki Kajita and Samuel Ting filled the 1820-seat Oriental Universal Theater with enthusiastic students.

A key focus of the meeting was visa-related issues affecting international conferences. Delegates reaffirmed the union’s commitment to scientists’ freedom of movement. IUPAP stands against any discrimination in physics and will continue to sponsor events only in locations that uphold this value – a stance at odds with the policy of countries imposing sanctions on scientists affiliated with specific institutions.

A joint session with the fall meeting of the Chinese Physical Society celebrated the 25th anniversary of the IUPAP working group “Women in Physics” and emphasised diversity, equity and inclusion in the field. Since 2002, IUPAP has established precise guidelines for the sponsorship of conferences to ensure that women are fairly represented among participants, speakers and committee members, and has actively monitored the data ever since. This has contributed to a significant change in the participation of women in IUPAP-sponsored conferences. IUPAP is now building on this still-necessary work on gender by focusing on discrimination on the grounds of disability and ethnicity.

The closing ceremony brought together the themes of continuity and change. Incoming president Silvina Ponce Dawson (University of Buenos Aires) and president-designate Sunil Gupta (Tata Institute) outlined their joint commitment to maintaining an open dialogue among all physicists in an increasingly fragmented world, and to promoting physics as an essential tool for development and sustainability. Outgoing leaders Michel Spiro (CNRS) and Bruce McKellar (University of Melbourne) were honoured for their contributions, and the ceremonial handover symbolised a smooth transition of leadership.

As the general assembly concluded, there was a palpable sense of momentum. From strategic modernisation to deeper engagement with global issues, IUPAP is well-positioned to make physics more relevant and accessible. The resounding message was one of unity and purpose: the physics community is dedicated to leveraging science for a brighter, more sustainable future.

The new hackerpreneur

The World Wide Web, AI and quantum computing – what do these technologies have in common? They all started out as “hacks”, says Jiannan Zhang, founder of the open-source community platform DoraHacks. “When the Web was invented at CERN, it demonstrated that in order to fundamentally change how people live and work, you have to think of new ways to use existing technology,” says Zhang. “Progress cannot be made if you always start from scratch. That’s what hackathons are for.”

Ten years ago, Zhang helped organise the first CERN Webfest, a hackathon that explores creative uses of technology for science and society. Webfest helped Zhang develop his coding skills and knowledge of physics by applying them to something beyond his own discipline. He also made long-lasting connections with teammates, who were from different academic backgrounds and all over the world. After participating in more hackathons, Zhang’s growing “hacker spirit” inspired him to start his own company. In 2024 Zhang returned to Webfest not as a participant, but as the CEO of DoraHacks.

Hackathons are social coding events often spanning multiple days. They are inclusive and open – no academic institution or corporate backing is required – making them accessible to a diverse range of talented individuals. Participants work in teams, pooling their skills to tackle technical problems through software, hardware or a business plan for a new product. Physicists, computer scientists, engineers and entrepreneurs all bring their strengths to the table. Young scientists can pursue work that may not fit within typical research structures, develop their skills, and build portfolios and professional networks.

“If you’re really passionate about some­thing, you should be able to jump on a project and work on it,” says Zhang. “You shouldn’t need to be associated with a university or have a PhD to pursue it.”

For early-career researchers, hackathons offer more than just technical challenges. They provide an alternative entry point into research and industry, bridging the gap between academia and real-world applications. University-run hackathons often attract corporate sponsors, giving them the budget to rent out stadiums with hundreds, sometimes thousands, of attendees.

“These large-scale hackathons really capture the attention of headhunters and mentors from industry,” explains Zhang. “They see the events as a recruitment pool. It can be a really effective way to advance careers and speak to representatives of big companies, as well as enhancing your coding skills.”

In the 2010s, weekend hackathons served as Zhang’s stepping stone into entrepreneurship. “I used to sit in the computer-science common room and work on my hacks. That’s how I met most of my friends,” recalls Zhang. “But later I realised that to build something great, I had to effectively organise people and capital. So I started to skip my computer-science classes and sneak into the business classrooms.” Zhang would hide in the back row of the business lectures, plotting his path towards entrepreneurship. He networked with peers to evaluate different business models each day. “It was fun to combine our knowledge of engineering and business theory,” he adds. “It made the journey a lot less stressful.”

But the transition from science to entrepreneurship was hard. “At the start you must learn and do everything yourself. The good thing is you’re exposed to lots of new skills and new people, but you also have to force yourself to do things you’re not usually good at.”

This is a dilemma many entrepreneurs face: whether to learn new skills from scratch, or to find business partners and delegate tasks. But finding trustworthy business partners is not always easy, and making the wrong decision can hinder the start-up’s progress. That’s why planning the company’s vision and mission from the start is so important.

“The solution is actually pretty straightforward,” says Zhang. “You need to spend more time completing the important milestones yourself, to ensure you have a feasible product. Once you make the business plan and vision clear, you get support from everywhere.”

Decentralised community governance

Rather than hackathon participants competing for a week before abandoning their code, Zhang started DoraHacks to give teams from all over the world a chance to turn their ideas into fully developed products. “I want hackathons to be more than a recruitment tool,” he explains. “They should foster open-source development and decentralised community governance. Today, a hacker from Tanzania can collaborate virtually with a team in the US, and teams gain support to develop real products. This helps make tech fields much more diverse and accessible.”

Zhang’s company enables this by reducing logistical costs for organisers and providing funding mechanisms for participants, making hackathons accessible to aspiring researchers beyond academic institutions. As the community expands, new doors open for young scientists at the start of their careers.

“The business model is changing,” says Zhang. Hackathons are becoming fundamental to emerging technologies, particularly in areas like quantum computing, blockchain and AI, which often start out open source. “There will be a major shift in the process of product creation. Instead of building products in isolation, new technologies rely on platforms and infrastructure where hackers can contribute.”

Today, hackathons aren’t just about coding or networking – they’re about pushing the boundaries of what’s possible, creating meaningful solutions and launching new career paths. They act as incubators for ideas with lasting impact. Zhang wants to help these ideas become reality. “The future of innovation is collaborative and open source,” he says. “The old world relies on corporations building moats around closed-source technology, which is inefficient and inaccessible. The new world is centred around open platform technology, where people can build on top of old projects. This collaborative spirit is what makes the hacker movement so important.”

The value of being messy

The line between science communication and public relations has become increasingly blurred. On one side, scientific press officers highlight institutional success, secure funding and showcase breakthrough discoveries. On the other, science communicators and journalists present scientific findings in a way that educates and entertains readers – acknowledging both the triumphs and the inherent uncertainties of the scientific process.

The core difference between these approaches lies in how they handle the inevitable messiness of science. Science isn’t a smooth, linear path of consistent triumphs; it’s an uncertain, trial-and-error journey. This uncertainty, and our willingness to discuss it openly, is what distinguishes authentic science communication from a polished public relations (PR) pitch. By necessity, PR often strives to present a neat narrative, free of controversy or doubt, but this risks creating a distorted perception of what science actually is.

Finding your voice

Take, for example, the situation in particle physics. Experiments probing the fundamental laws of physics are often critiqued in the press for their hefty price tags – particularly when people are eager to see resources directed towards solving global crises like climate change or preventing future pandemics. When researchers and science communicators are finding their voice, a pressing question is how much messiness to communicate in uncertain times.

After completing my PhD as part of the ATLAS collaboration, I became a science journalist and communicator, connecting audiences across Europe and America with the joy of learning about fundamental physics. After a recent talk at the Royal Institution in London, in which I explained how ATLAS measures fundamental particles, I received an email from a colleague. The only question the talk prompted him to ask was about the safety of colliding protons to create undiscovered particles. This reaction reflects how scientific misinformation – such as the idea that experiments at CERN could endanger the planet – can be persistent and difficult to eradicate.

In response to such criticisms and concerns, I have argued many times for the value of fundamental physics research, often highlighting the vast number of technological advancements it enables, from touch screens to healthcare advances. However, we must be wary not to only rely on this PR tactic of stressing the tangible benefits of research, as it can sometimes sidestep the uncertainties and iterative nature of scientific investigation, presenting an oversimplified version of scientific progress.

From Democritus to the Standard Model

This PR-driven approach risks undermining public understanding and trust in science in the long run. When science is framed solely as a series of grand successes without any setbacks, people may become confused or disillusioned when they inevitably encounter controversies or failures. Instead, this is where honest science communication shines – admitting that our understanding evolves, that we make mistakes and that uncertainties are an integral part of the process.

Our evolving understanding of particle physics is a perfect illustration of this. From Democritus’ concept of “indivisible atoms” to the development of the Standard Model, every new discovery has refined or even overhauled our previous understanding. This is the essence of science – always refining, never perfect – and it’s exactly what we should be communicating to the public.

Embracing this messiness doesn’t necessarily reduce public trust. When presenting scientific results to the public, it’s important to remember that uncertainty can take many forms, and how we communicate these forms can significantly affect credibility. Technical uncertainty – expressing complexity or incomplete information – often increases audience trust, as it communicates the real intricacies of scientific research. Conversely, consensus uncertainty – spotlighting disagreements or controversies among experts – can have a negative impact on credibility. When it comes to genuine disagreements among scientists, effectively communicating uncertainty to the public requires a thoughtful balance. Transparency is key: acknowledging the existence of different scientific perspectives helps the public understand that science is a dynamic process. Providing context about why disagreements exist, whether due to limited data or competing theoretical frameworks, also helps in making the uncertainty comprehensible.

Embrace errors

In other words, the next time you present your latest results on social media, don’t shy away from including the error bars. And if you must have a public argument with a colleague about what the results mean, context is essential!


No one knows where the next breakthrough will come from or how it might solve the challenges we face. In an information ecosystem increasingly filled with misinformation, scientists and science communicators must help people understand the iterative, uncertain and evolving nature of science. As science communicators, we should be cautious not to stray too far into PR territory. Authentic communication doesn’t mean glossing over uncertainties but rather embracing them as an essential part of the story. This way, the public can appreciate science not just as a collection of established facts, but as an ongoing, dynamic process – messy, yet ultimately satisfying.

ICFA talks strategy and sustainability in Prague

ICFA, the International Committee for Future Accelerators, was formed in 1976 to promote international collaboration in all phases of the construction and exploitation of very-high-energy accelerators. Its 96th meeting took place on 20 and 21 July during the recent ICHEP conference in Prague. Almost all of the 16 members from across the world attended in person, making the assembly lively and constructive.

The committee heard extensive reports from the leading HEP laboratories and various world regions on their recent activities and plans, including a presentation by Paris Sphicas, the chair of the European Committee for Future Accelerators (ECFA), on the process for the update of the European strategy for particle physics (ESPP). Launched by CERN Council in March 2024, the ESPP update is charged with recommending the next collider project at CERN after HL-LHC operation.

A global task

The ESPP update is also of high interest to non-European institutions and projects. Consequently, in addition to the expected inputs to the strategy from European HEP communities, those from non-European HEP communities are also welcome. Moreover, the recent US P5 report and the Chinese plans for CEPC, with a potential positive decision in 2025/2026, and discussions about the ILC project in Japan, will be important elements of the work to be carried out in the context of the ESPP update. They also emphasise the global nature of high-energy physics.

An integral part of the work of ICFA is carried out within its panels, which have been very active. Presentations were given from the new panel on the Data Lifecycle (chair Kati Lassila-Perini, Helsinki), the Beam Dynamics panel (new chair Yuan He, IMPCAS) and the Advanced and Novel Accelerators panel (new chair Patric Muggli, Max Planck Munich, proxied at the meeting by Brigitte Cros, Paris-Saclay). The Instrumentation and Innovation Development panel (chair Ian Shipsey, Oxford) is setting an example with its numerous schools, the ICFA instrumentation awards and centrally sponsored instrumentation studentships for early-career researchers from underserved world regions. Finally, the chair of the ILC International Development Team panel (Tatsuya Nakada, EPFL) summarised the latest status of the ILC Technological Network, and the proposed ILC collider project in Japan.


A special session was devoted to the sustainability of HEP accelerator infrastructures, considering the need to develop guidelines that enable better comparison of the environmental reports of labs and infrastructures, in particular for future facilities. It was therefore natural for ICFA to hear reports not only from the panel on Sustainable Accelerators and Colliders led by Thomas Roser (BNL), but also from the European Lab Directors Working Group on Sustainability. This group, chaired by Caterina Bloise (INFN) and Maxim Titov (CEA), is mandated to develop a set of key indicators and a methodology for reporting on future HEP projects, to be delivered in time for the ESPP update.

Finally, ICFA noted some very interesting structural developments in the global organisation of HEP. In the Asia-Oceania region, ACFA-HEP was recently formed as a sub-panel under the Asian Committee for Future Accelerators (ACFA), aiming for a better coordination of HEP activities in this particular region of the world. Hopefully, this will encourage other world regions to organise themselves in a similar way in order to strengthen their voice in the global HEP community – for example in Latin America. Here, a meeting was organised in August by the Latin American Association for High Energy, Cosmology and Astroparticle Physics (LAA-HECAP) to bring together scientists, institutions and funding agencies from across Latin America to coordinate actions for jointly funding research projects across the continent.

The next in-person ICFA meeting will be held during the Lepton–Photon conference in Madison, Wisconsin (USA), in August 2025.

Tsung-Dao Lee 1926–2024

On 4 August 2024, the great physicist Tsung-Dao Lee (also known as T D Lee) passed away at his home in San Francisco, aged 97.

Born in 1926 to an intellectual family in Shanghai, Lee’s education was disrupted several times by the war against Japan. He neither completed high school nor graduated from university. In 1943, however, he took the national entrance exam and, with outstanding scores, was admitted to the chemical engineering department of Zhejiang University. He then transferred to the physics department of Southwest Associated University, a temporary setup during the war for Peking, Tsinghua and Nankai universities. In the autumn of 1946, under the recommendation of Ta-You Wu, Lee went to study at the University of Chicago under the supervision of Enrico Fermi, earning his PhD in June 1950.

From 1950 to 1953 Lee conducted research at the University of Chicago, the University of California, Berkeley and the Institute for Advanced Study, located in Princeton. During this period, he made significant contributions to particle physics, statistical mechanics, field theory, astrophysics, condensed-matter physics and turbulence theory, demonstrating a wide range of interests and deep insights in several frontiers of physics. In a 1952 paper on turbulence, for example, Lee pointed out the significant difference between fluid dynamics in two-dimensional and three-dimensional spaces, namely, there is no turbulence in two dimensions. This finding provided essential conditions for John von Neumann’s model, which used supercomputers to simulate weather.

Profound impact

During this period, Lee and Chen-Ning Yang collaborated on two foundational works in statistical physics concerning phase transitions, discovering the famous “unit circle theorem” on lattice gases, which had a profound impact on statistical mechanics and phase-transition theory.

Between 1952 and 1953, during a visit to the University of Illinois at Urbana-Champaign, Lee was inspired by discussions with John Bardeen (winner, with Leon Neil Cooper and John Robert Schrieffer, of the 1972 Nobel Prize in Physics for developing the first successful microscopic theory of superconductivity). Lee applied field-theory methods to study the motion of slow electrons in polar crystals, pioneering the use of field theory to investigate condensed matter systems. According to Schrieffer, Lee’s work directly influenced the development of their “BCS” theory of superconductivity.

In 1953, after taking an assistant professor position at Columbia University, Lee proposed a renormalisable field-theory model, widely known as the “Lee Model”, which had a substantial impact on the study of renormalisation in quantum field theory.

On 1 October 1956, Lee and Yang’s theory of parity non-conservation in weak interactions was published in Physical Review. It was quickly confirmed by the experiments of Chien-Shiung Wu and others, earning Lee and Yang the 1957 Nobel Prize in Physics – one of the fastest recognitions in the history of the Nobel Prize. The discovery of parity violation significantly challenged the established understanding of fundamental physical laws and directly led to the establishment of the universal V–A theory of weak interactions in 1958. It also laid the groundwork for the unified theory of weak and electromagnetic interactions developed a decade later.

In 1957, Lee, Reinhard Oehme and Yang extended symmetry studies to combined charge–parity (CP) transformations. The CP non-conservation discovered in neutral K-meson decays in 1964 validated the importance of Lee and his colleagues’ theoretical work, as well as the later establishment of CP violation theories. The same year, Lee was appointed the Fermi Professor of Physics at Columbia.

In the 1970s, Lee published papers exploring the origins of CP violation, suggesting that it might stem from spontaneous symmetry breaking in the vacuum and predicting several significant phenomenological consequences. In 1974, Lee and G C Wick investigated whether spontaneously broken symmetries in the vacuum could be partially restored under certain conditions. They found that heavy-ion collisions could achieve this restoration and produce observable effects. This work pioneered the study of the quantum chromodynamics (QCD) vacuum, phase transitions and quark–gluon plasma. It also laid the theoretical and experimental foundation for relativistic heavy-ion collision physics.

From 1982, Lee devoted significant efforts to solving non-perturbative QCD using lattice-QCD methods. Together with Norman Christ and Richard Friedberg, he developed stochastic lattice field theory and promoted first-principle lattice simulations on supercomputers, greatly advancing lattice QCD research.

Immense respect

In 2011 Lee retired as a professor emeritus from Columbia at the age of 85. In China, he enjoyed immense respect, not only for being the first Chinese scientist (with Chen-Ning Yang) to win a Nobel Prize, but also for enhancing the level of science and education in China and promoting the Sino-American collaboration in high-energy physics. This led to the establishment and successful construction of China’s first major high-energy physics facility, the Beijing Electron–Positron Collider (BEPC). At the beginning of this century, Lee supported and personally helped the upgrade of BEPC, the Daya Bay reactor neutrino experiment and others. In addition, he initiated, promoted and executed the China–US Physics Examination and Application plan, the National Natural Science Foundation of China, and the postdoctoral system in China.

Tsung-Dao Lee’s contributions to an extraordinarily wide range of fields profoundly shaped humanity’s understanding of the basic laws of the universe.

Robert Aymar 1936–2024

Robert Aymar, CERN Director-General from January 2004 to December 2008, passed away on 23 September at the age of 88. An inspirational leader in big-science projects for several decades, including the International Thermonuclear Experimental Reactor (ITER), his term of office at CERN was marked by the completion of construction and the first commissioning of the Large Hadron Collider (LHC). His experience of complex industrial projects proved to be crucial, as the CERN teams had to overcome numerous challenges linked to the LHC’s innovative technologies and their industrial production.

Robert Aymar was educated at École Polytechnique in Paris. He started his career in plasma physics at the Commissariat à l’Énergie Atomique (CEA), since renamed the Commissariat à l’Énergie Atomique et aux Énergies Alternatives, at the time when thermonuclear fusion was declassified and research started on its application to energy production. After being involved in several studies at CEA, Aymar contributed to the design of the Joint European Torus, the European tokamak project based on conventional magnet technology, built in Culham, UK, in the late 1970s. In the same period, CEA was considering a compact tokamak project based on superconducting magnet technology, for which Aymar decided to use pressurised superfluid helium cooling – a technology then recently developed by Gérard Claudet and his team at CEA Grenoble. Aymar was naturally appointed head of the Tore Supra tokamak project, built at CEA Cadarache from 1977 to 1988. The successful project served inter alia as an industrial-sized demonstrator of superfluid helium cryogenics, which became a key technology of the LHC.

As head of the Département des Sciences de la Matière at CEA from 1990 to 1994, Aymar set out to bring together the physics of the infinitely large and the infinitely small, as well as the associated instrumentation, in a department that has now become the Institut de Recherche sur les Lois Fondamentales de l’Univers. In that position, he actively supported CEA–CERN collaboration agreements on R&D for the LHC and served on many national and international committees. In 1993 he chaired the LHC external review committee, whose recommendation proved decisive in the project’s approval. From 1994 to 2003 he led the ITER engineering design activities under the auspices of the International Atomic Energy Agency, establishing the basic design and validity of the project that would be approved for construction in 2006. In 2001, the CERN Council called on his expertise once again by entrusting him to chair the external review committee for CERN’s activities.

When Robert Aymar took over as Director-General of CERN in 2004, the construction of the LHC was well under way. But there were many industrial and financial challenges, and a few production crises still to overcome. During his tenure, which saw the ramp-up, series production and installation of major components, the machine was completed and the first beams circulated. That first start-up in 2008 was followed by a major technical problem that led to a shutdown lasting several months. But the LHC had demonstrated that it could run, and in 2009 the machine was successfully restarted. Aymar’s term of office also saw a simplification of CERN’s structure and procedures, aimed at making the laboratory more efficient. He also set about reducing costs and secured additional funding to complete the construction and optimise the operation of the LHC. After retirement, he remained active as a scientific advisor to the head of the CEA, occasionally visiting CERN and the ITER construction site in Cadarache.

Robert Aymar was a dedicated and demanding leader, with a strong drive and search for pragmatic solutions in the activities he undertook or supervised. CERN and the LHC project owe much to his efforts. He was also a man of culture with a marked interest in history. It was a privilege to serve under his direction.

James D Bjorken 1934–2024


Theoretical physicist James D “BJ” Bjorken, whose work played a key role in revealing the existence of quarks, passed away on 6 August aged 90. Part of a wave of young physicists who came to Stanford in the mid-1950s, Bjorken also made important contributions to the design of experiments and the efficient operation of accelerators.

Born in Chicago on 22 June 1934, James Daniel Bjorken grew up in Park Ridge, Illinois, where he was drawn to mathematics and chemistry. His father, who had immigrated from Sweden in 1923, was an electrical engineer who repaired industrial motors and generators. After earning a bachelor’s degree at MIT, Bjorken went to Stanford University as a graduate student in 1956. He was one of half a dozen MIT physicists, including his adviser Sidney Drell and future director of the SLAC National Accelerator Laboratory Burton Richter, who were drawn by new facilities on the Stanford campus. These included an early linear accelerator that scattered electrons off targets to explore the nature of the neutron and proton.

Ten years later those experiments moved to SLAC, where the newly constructed two-mile linear accelerator would boost electrons to much higher energies. By that time, theorists had proposed that protons and neutrons contained fundamental particles. But no one knew much about their properties or how to go about proving they were there. Bjorken, who joined the Stanford faculty in 1961, wrote an influential 1969 paper in which he suggested that electrons were bouncing off point-like particles within the proton, a process known as deep inelastic scattering. He started lobbying experimentalists to test it with the SLAC accelerator.

Carrying out the experiments would require a new mathematical language and Bjorken contributed to its development, with simplifications and improvements from two of his students (John Kogut and Davison Soper) and Caltech physicist Richard Feynman. In the late 1960s and early 1970s, those experiments confirmed that the proton does indeed consist of fundamental particles – a discovery honoured with the 1990 Nobel Prize in Physics for SLAC’s Richard Taylor and MIT’s Henry Kendall and Jerome Friedman. Bjorken’s role was later recognised by the prestigious Wolf Prize in Physics and the 2015 High Energy and Particle Physics Prize of the European Physical Society.

While the invention of “Bjorken scaling” was his most famous scientific achievement, Bjorken was also known for identifying a wide variety of interesting problems and tackling them in novel ways. He was somewhat iconoclastic. He also had colourful and often distinctly visual ways of thinking about physics – for instance, describing physics concepts in terms of plumbing or a baked Alaska. He never sought recognition for himself and was very generous in recognising the contributions of others.

In 1979 Bjorken headed east to become associate director for physics at Fermilab. He returned to SLAC in 1989, where he continued to innovate. Over the course of his career, among other things, he invented ideas related to the existence of the charm quark and the circulation of protons in a storage ring. He helped popularise the unitarity triangle and, along with Drell, co-wrote the widely used graduate-level textbooks Relativistic Quantum Mechanics and Relativistic Quantum Fields. In 2009 Bjorken contributed to an influential paper by three younger theorists suggesting approaches for searching for “dark” photons, hypothetical carriers of a new fundamental force.

He was also awarded the American Physical Society’s Dannie Heineman Prize, the Department of Energy’s Ernest Orlando Lawrence Award, and the Dirac Medal from the International Center for Theoretical Physics. In 2017 he shared the Robert R Wilson Prize for Achievement in the Physics of Particle Accelerators for groundbreaking theoretical work he did at Fermilab that helped to sharpen the focus of particle beams in many types of accelerators.

Known for his warmth, generosity and collaborative spirit, Bjorken passionately pursued many interests outside physics, from mountain climbing, skiing, cycling and windsurfing to listening to classical music. He divided his time between homes in Woodside, California and Driggs, Idaho, and thought nothing of driving long distances to see an opera in Chicago or dropping in unannounced at the office of some fellow physicist for deep conversations about general relativity, dark matter or dark energy – once remarking: “I’ve found the most efficient way to test ideas and get hard criticism is one-on-one conversation with people who know more than I do.”
