DUNE and its CERN connection


The Deep Underground Neutrino Experiment (DUNE) is a next-generation long-baseline neutrino-oscillation experiment, currently under review by the US Department of Energy (DOE). DUNE has a potentially game-changing scientific programme for neutrino physics.

The DUNE collaboration came together in response to the US P5 report on the “Strategic Plan for US Particle Physics in the Global Context”, published in 2014, and the recommendation of the European Strategy for Particle Physics not to pursue the development of a new neutrino beam at CERN. The P5 report called for the previously US-dominated LBNE experiment to be reformulated as a truly international scientific endeavour, incorporating the scientific goals and expertise of the worldwide neutrino-physics community, in particular those developed by LBNO in Europe. As a result, the international DUNE collaboration was formed and structured following a model successfully adopted by the LHC experiments.

The DUNE collaboration currently consists of almost 800 scientists and engineers from 145 institutes in 26 nations. The rapid development of this large collaboration is indicative of the global interest in neutrino physics and the innovative science made possible with the DUNE near and far detectors and the proposed Long-Baseline Neutrino Facility (LBNF) at Fermilab. The strong partnership between the US DOE and CERN already established in the LHC programme is also one of the essential components for the success of DUNE/LBNF. Construction of the CERN facility that will host two large-scale DUNE prototype detectors and a test beam has already begun.

So what is DUNE/LBNF? LBNF is a new 60–120 GeV beamline at Fermilab that can produce an intense beam of either muon neutrinos or muon antineutrinos. The initial beam power will be 1.2 MW (compared with the 700 kW maximum planned for Fermilab’s existing NuMI beam, which serves the NOvA experiment). This is just the first step for LBNF: the beam is being designed to be upgradable to at least 2.4 MW.

The neutrino beam will be directed towards a near and a far detector. DUNE’s far detector will be located 1.5 km underground at the Sanford Underground Research Facility (SURF) in South Dakota. Neutrinos will travel 1300 km through the Earth’s crust, allowing the neutrino flavours to oscillate along the way. The DUNE far detector consists of four 10 kton (fiducial) liquid-argon time projection chambers (LAr-TPCs). These detectors are very large – each will be approximately 62 × 15 × 14 m. The advantage of the LAr-TPC technology is that it allows 3D bubble-chamber-like imaging of neutrino interactions (or proton decay) throughout the vast detector volume. The DUNE near detector on the Fermilab site will observe the unoscillated neutrino beam, providing constraints on experimental uncertainties. By the standards of neutrino physics, the near-detector event rates are enormous – it will record hundreds of millions of neutrino interactions, enabling a diverse and world-leading neutrino-physics programme.


DUNE/LBNF has a broad and comprehensive scientific programme – it aims to make groundbreaking discoveries such as the observation of CP violation in the neutrino sector, together with a measurement of the corresponding CP phase. Because of the long baseline, DUNE will also conclusively determine the neutrino-mass ordering (normal versus inverted hierarchy). The sensitivity to the mass hierarchy arises because the neutrinos traverse 1300 km of matter (as opposed to antimatter). These “matter effects” imply that the oscillations of muon neutrinos to electron neutrinos are expected to differ from those of the corresponding process for antineutrinos, independently of CP violation. DUNE will measure both CP violation and the mass hierarchy in a single experiment by utilising a wide-band beam, so that the oscillations can be measured as a function of neutrino energy (covering both the first and second oscillation maxima). One of the advantages of a LAr-TPC is that it acts as a totally active calorimeter in which the energy deposits from all final-state particles are detectable, resulting in an excellent neutrino-energy measurement over the broad range of energies needed to study the first and second oscillation maxima. In general, the large event samples of muon neutrino/antineutrino interactions (in the disappearance channel) and electron neutrino/antineutrino interactions (in the appearance channel) will enable neutrino oscillations to be probed with unprecedented precision, providing a test of the current three-flavour neutrino paradigm – there may yet be surprises lurking in the neutrino sector.
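To see why a wide-band beam is needed to cover both oscillation maxima, it helps to look at the numbers. Below is a minimal Python sketch of the appearance probability in a simplified two-flavour vacuum approximation – matter effects and CP violation are neglected, and the mixing amplitude is an illustrative value, not a DUNE parameter:

import numpy as np

L = 1300.0            # Fermilab-to-SURF baseline in km
dm2 = 2.5e-3          # atmospheric mass splitting in eV^2 (typical value)
amp = 0.085           # effective mixing amplitude, illustrative only

def p_appear(E):
    """P(nu_mu -> nu_e) in the two-flavour vacuum approximation; E in GeV."""
    return amp * np.sin(1.27 * dm2 * L / E) ** 2

for E in (0.9, 2.6, 5.0):
    print(f"E = {E:.1f} GeV: P = {p_appear(E):.3f}")

# The probability peaks near 2.6 GeV (first maximum) and again near
# 0.9 GeV (second maximum) - hence a beam spanning roughly 0.5-5 GeV.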

DUNE is not only about neutrinos. The large far detector with bubble-chamber-like imaging capability, located deep underground, provides an opportunity to search for proton decay. In particular, DUNE is able to search for proton-decay modes with kaons (such as p → K+ν̄), which are favoured in many SUSY scenarios. The clear topological and ionisation (dE/dx) signature of these decay modes allows for a near-background-free search – a significant advantage in capability over large water Cherenkov detectors. Furthermore, DUNE will provide unique capabilities for the observation of neutrinos from core-collapse supernova bursts (SNBs). While water Cherenkov detectors are primarily sensitive to electron antineutrinos from SNBs, DUNE is mostly sensitive to electron neutrinos. This would enable DUNE to directly observe the neutron-star-formation stage (p + e → n + νe) in “real time”, albeit delayed by the time that it takes for neutrinos to reach the Earth – a truly remarkable observation. There is even the possibility of observing the formation of a black hole as a sharp cut-off in the time spectrum of the SNB neutrinos, if the black hole were to form a few seconds after the stellar-core collapse.

CERN’s role

CERN is playing a crucial role in prototyping the DUNE far detector and in developing a detailed understanding of its performance. Following the recommendations of the European Strategy document, CERN has set up a programme to fulfil the needs of large-scale neutrino-detector prototyping. In the framework of this programme, a new neutrino “platform” is taking shape in the North Area. The new CERN facility will be available for experiments in the autumn of 2016 and will include a 70 m extension of the EHN1 experimental hall, which will house the large experimental set-ups and expose them to charged-particle test beams. The plan is to operate the first charged-particle beams in 2017, after the civil engineering and infrastructure work needed to upgrade the experimental hall has been completed.


Delivering the DUNE far detector requires the LAr-TPC technology to be scaled up to an industrial level. The CERN platform will support the development of the single-phase and dual-phase liquid-argon technologies that are being considered on a large scale for the DUNE far detectors. In the single-phase approach, the ionisation electrons produced by charged particles are drifted towards read-out wire planes in the liquid-argon volume. In the dual-phase approach, the ionisation electrons are amplified in gaseous argon above the liquid surface and then read out. The CERN platform will host two large-scale prototypes for the DUNE far detector – ProtoDUNE and WA105.

ProtoDUNE is the engineering prototype for the single-phase far-detector design currently planned for the first 10 kton far-detector module. ProtoDUNE is based on the pioneering work carried out for the ICARUS detector operated at the Gran Sasso underground laboratory. The ICARUS detector, with its 600 tonnes of liquid argon, took data from 2010 to 2012. It demonstrated that a liquid-argon TPC detector can provide detailed images of charged particles and electromagnetic showers, with excellent spatial and calorimetric resolution. ICARUS also demonstrated the long-term stability of the LAr-TPC concept.

The WA105 demonstrator will be based on the novel dual-phase liquid-argon time projection chamber that was developed by the European LAGUNA-LBNO consortium, with R&D efforts located at CERN for more than a decade. The dual-phase approach, which offers potential advantages over the single-phase read-out, is being considered by DUNE for one or more of the DUNE far-detector 10 kton modules. The WA105 collaboration is currently building a smaller-scale 25 tonne prototype at CERN, to be operated in 2016. The larger 300 tonne WA105 demonstrator should be ready for test beam by 2018 in the EHN1 extension of the North Area at CERN.

The goal of these prototypes is to validate the construction techniques that will be adopted for the deep-underground installation at SURF, and to measure the performance of full-scale modules. In addition, the EHN1 test beams will provide the unique capability to collect and analyse charged-particle data necessary to understand the response of these detectors, with the high precision required for the DUNE science programme. The CERN neutrino platform will also serve additional R&D efforts, in particular for the DUNE near detector, where the current design utilises a straw-tube tracking chamber (inspired by the earlier NOMAD experiment at CERN), but other options, such as a high-pressure gaseous-argon TPC, are being studied.

The DUNE/LBNF scientific programme has broad support from partners in the Americas, Asia and Europe, and the collaboration is expected to grow. Progress in the last year has been rapid; DUNE/LBNF produced a four-volume Conceptual Design Report (CDR) in July 2015, detailing the design of the DUNE near and far detectors and the design of LBNF, which encompasses both the new neutrino beamline at Fermilab and civil facilities for the DUNE detectors. The CDR was a crucial element of the DOE CD-1 review of the cost range for the project. DUNE/LBNF is currently seeking DOE CD-3a approval for the underground excavation of the far-site facility that would host the four far-detector modules. The timescales are relatively short, with the start of the excavation project planned for 2017 and installation of the first far-detector module planned to start in 2021, with first commissioning for physics starting soon after. The strong role of CERN in this programme is crucial to its success.

• For further details, see lbnf.fnal.gov and www.dunescience.org.

Penetrating and puzzling: photons shed light on heavy-ion collisions

Résumé

Photon emissions shed light on heavy-ion collisions

The light emitted in high-energy collisions of heavy nuclei reveals unique information about the quark–gluon plasma.

Quark–gluon plasma (QGP) is a thermalised state of matter at extreme temperatures, consisting of deconfined quarks and gluons. Because the mean free path of energetic photons is much larger than the size of the hot nuclear medium in which they are produced, they provide direct, undistorted information on the thermodynamic state of the QGP. In contrast to photons, hadrons, which are produced after the QGP has cooled to a temperature of about 1.8 × 10¹² K (kBT = 155 MeV), mostly reflect the properties of the hadronic phase and carry only indirect information about the preceding QGP phase.
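As a quick cross-check of the quoted temperature, kBT = 155 MeV converts to kelvin via Boltzmann’s constant:

k_B = 8.617e-11            # Boltzmann constant in MeV per kelvin
T = 155 / k_B              # temperature corresponding to k_B*T = 155 MeV
print(f"T = {T:.1e} K")    # ~1.8e12 K, as quoted above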

The role of direct photons

Photons are emitted over the entire duration of a heavy-ion collision via various production mechanisms. First, direct photons are distinguished from photons originating from the decay of neutral mesons, which constitute the background in the direct-photon measurement. Prompt direct photons are produced in initial hard-parton scatterings, prior to the formation of a QGP, and dominate the photon spectrum at large values of the transverse momentum (pT), beyond 4 GeV/c. Because photons pass through the medium unaffected, their yield, well described by perturbative quantum chromodynamics (pQCD), directly reflects the rate of initial hard-scattering processes. By contrast, the yield of high-pT hadrons is suppressed, an observation interpreted as the result of the energy lost in the QGP by quarks and gluons produced in hard-scattering processes. The interpretation of this effect, known as “jet quenching”, relies strongly on the observation that direct photons at high pT are not suppressed.

Thermal direct photons are produced in the QGP and in the subsequent hadron gas. They are expected to give a significant contribution at low pT (1 < pT < 3 GeV/c), convey information about the QGP temperature, and provide a test for models of the space–time evolution of a heavy-ion collision. For a given temperature, the spectrum of thermal photons falls off exponentially with transverse momentum, so that the temperature of the photon source can be read off from the slope. This is similar to the determination of the temperature of a red-hot heating element, or the surface of the Sun, based on the emitted thermal photon radiation. Note, however, that unlike in these examples, the photons from the QGP and the hadron gas are not in thermal equilibrium with the surrounding medium. In heavy-ion collisions, the thermal photon spectrum is an effective average over the different volume elements in the QGP and the hadron gas at different temperatures. However, during the latter stages of the collision, volume elements move with considerable velocity towards the detector. The resulting blue-shift of the spectrum leads to an apparent temperature that can be as large as twice the actual temperature of the source. Determining the initial QGP temperature therefore relies on comparison with models such as the hydrodynamic description of the evolution of heavy-ion collisions, which has proved to be very successful for hadrons. Being produced at all stages of the collisions, low-pT direct photons therefore provide an independent constraint for these models.
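A minimal numerical sketch of this slope-reading, with invented values (a 0.3 GeV effective temperature and a flow velocity of 0.6c), shows how the blue-shift factor enters:

import numpy as np

T_eff = 0.30                                # assumed effective temperature in GeV
pT = np.linspace(1.0, 3.0, 20)              # transverse momentum in GeV/c
spectrum = np.exp(-pT / T_eff)              # dN/dpT ~ exp(-pT / T_eff)

# The inverse slope of log(spectrum) vs pT recovers the effective temperature:
slope, _ = np.polyfit(pT, np.log(spectrum), 1)
print(f"fitted T_eff = {-1.0/slope:.3f} GeV")

# A source element moving towards the detector with velocity beta
# blue-shifts the spectrum by the Doppler factor:
beta = 0.6
doppler = np.sqrt((1 + beta) / (1 - beta))  # = 2 for beta = 0.6
print(f"underlying temperature ~ {T_eff/doppler:.3f} GeV")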

The ALICE experiment and experiments at RHIC have measured two distinct features of direct photons: their yield and their azimuthal anisotropy as a function of pT. ALICE has measured the direct-photon spectrum in central-to-peripheral Pb–Pb collisions at √sNN = 2.76 TeV (see figure opposite). For pT > 4 GeV/c, the spectrum agrees with the expectation for direct-photon production in initial hard-parton scatterings, as calculated by pQCD. At lower pT, most notably in central Pb–Pb collisions, there appears to be an additional source of (most likely) thermal photons from the QGP and the hadron gas. State-of-the-art hydrodynamic models agree with the measured direct-photon spectra within uncertainties; however, they tend to under-predict the central values of the measurement. For the direct-photon spectrum measured in central Au–Au collisions at √sNN = 200 GeV at RHIC, the differences between measurements and predictions become more prominent.

Anisotropies as puzzling as yields

Even more surprisingly, the azimuthal anisotropies of low-pT direct photons measured in heavy-ion collisions were found to be much larger than predicted by hydrodynamic models. The anisotropies are a consequence of the approximately almond-shaped overlap zone of the two nuclei in non-central collisions. This gives rise to a variation of pressure gradients as a function of the azimuthal angle. As a consequence, azimuthally dependent collective flow velocities develop as the system expands, and give rise to the experimentally observed elliptic flow, an azimuthal variation of hadron yields. This is quantified by a Fourier coefficient, v2(pT). Taking into account the fact that collective flow fields take time to build up, and that photon production is dominated by photons from the early hot QGP phase, the azimuthal anisotropy v2 of direct photons at low pT was expected to be smaller than that of hadrons. The PHENIX experiment at RHIC, however, measured v2 values similar to those of hadrons. ALICE seems to confirm this observation. These measurements could indicate that thermal photons from the late hadron-gas phase outshine photons from the early QGP.
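The coefficient v2 is defined through dN/dφ ∝ 1 + 2v2 cos[2(φ − Ψ)], where Ψ is the event-plane angle. A toy Monte Carlo sketch in Python (synthetic angles and an assumed v2 of 0.08, purely for illustration) shows how it is recovered as an average:

import numpy as np

rng = np.random.default_rng(1)
v2_true, psi = 0.08, 0.0          # assumed anisotropy and event-plane angle

# Sample azimuthal angles from dN/dphi ~ 1 + 2*v2*cos(2(phi - psi))
# by acceptance-rejection against the distribution's maximum:
phi = rng.uniform(0.0, 2*np.pi, 200_000)
keep = rng.uniform(0.0, 1 + 2*v2_true, phi.size) < 1 + 2*v2_true*np.cos(2*(phi - psi))
phi = phi[keep]

# With a known event plane, v2 is the mean of cos(2(phi - psi)):
print(f"v2 = {np.mean(np.cos(2*(phi - psi))):.3f}")   # ~0.08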

The observation that models under-predict both the thermal photon spectrum and the v2 is puzzling, and is one of the most pressing challenges for our understanding of heavy-ion collisions. The puzzle is currently more apparent in Au–Au collisions at RHIC, with much less of a tension in Pb–Pb collisions at the LHC. Its solution could be related to the hydrodynamic modelling of the space–time evolution of heavy-ion collisions, or to the theoretical description of photon production. It could also point to so-far-unknown photon production mechanisms. Many new theoretical ideas have recently been put forward in this direction. Until this puzzle is solved, the question about the role of thermal photons remains open: are they messengers of the QGP, or are thermal photons from the QGP only a small contribution buried under photons produced much later in the hadron gas? An answer is eagerly awaited.

• For further details, see arxiv.org/abs/1509.07324 and arxiv.org/abs/1212.3995, ALICE Collaboration; arxiv.org/abs/1405.3940 and arxiv.org/abs/1509.07758, PHENIX Collaboration.

A close look at the world’s largest astronomical project

Résumé

Spotlight on the world’s largest astronomical project

Eight years after their first visit to the ALMA site in Chile, CERN’s Paola Catapano and Mike Struik return to find the 66 antennas now in operation. Thanks to the scale of the array, the resolution achieved is equivalent to that of a telescope 15 km in diameter. All of the objectives set for ALMA are being met, one after the other. The facility began scientific operation in 2011, before its completion. What can be expected from this exceptional instrument? For astronomers, given ALMA’s capabilities, anything is possible: the expected, and the unexpected.

Chajnantor highlands, Chile, 5100 m above sea level. Winding up through the Echinopsis atacamensis – a rare species of centennial cacti that grows only at altitudes between 3200 and 3800 m – the 12 m-wide road to the top has not changed since my last visit to the plateau eight years ago, nor has the magnificent backdrop of the snow-capped Andean volcanoes. It’s only at the last checkpoint, at 41 km, that I catch the first glimpse of the shiny white discs of the ALMA array. What an extraordinary feeling. The once flat, vast and empty desert site is now studded with 66 giant parabolic antennas surrounding the point once marked as the Centre of Array (COFA, see photo bottom right). The mark is still there, but ALMA today is no longer just a video animation, although its 66 antennas still look unreal against the backdrop of the thin transparent air at 5100 m. Currently in an intermediate configuration, the antennas are designed to be moved from a compact to a long-baseline layout, with a maximum extension of 16 km.

Reconciliation with the cosmos

Chajnantor is probably the only place on the planet that could offer such a vast, flat and dry space at this altitude. In the indigenous “kunza” language, still spoken by the Atacameños population in the San Pedro area, Chajnantor means “the place of departure, of reconciliation with the cosmos.” No better name could have been chosen for the site that now hosts this revolutionary astronomical observatory, the most advanced and powerful array of high-precision antennas ever to search for our cosmic origins from the ground. ALMA is the only instrument offering the high sensitivity and spectral resolution – 10 times better than the Hubble Space Telescope’s – needed to catch the faint radio waves emitted by the cold and so-far-invisible regions of our universe. These submillimetre and millimetre radio waves are impossible to observe with optical telescopes. “These signals come from multiple physical processes,” explains Daniel Espada, shift leader at the time of my visit to the observatory in July. “One source is molecules: molecular transitions in the cold regions of the universe, which carry just a little energy. In some regions, we can see this primordial material, essential to life, and understand its composition. These molecules travel and radiate, and we can observe them and learn about some of their properties, including temperature and density. ALMA allows us to perform high-precision studies of the properties of the interstellar dust, which is key to understanding how the energy of a galaxy is emitted.” “Nobody had ever observed those frequencies with such sensitivity,” adds astronomer Gianni Marconi, one of the permanent ESO staff on the project. “For the past four years, everything ALMA has observed has been a press release, because nobody had ever observed those things before. They’re all firsts.”

Amazing performance

Indeed, all of the objectives set for ALMA have been achieved, one after the other, since it began scientific operation in 2011, while the array was still being completed. Among its most striking “firsts”, we all remember the discovery, in 2012, of glycolaldehyde in the vicinity of a Sun-like star (CERN Courier October 2012 p13). This simple sugar molecule is one of the ingredients in the formation of ribonucleic acid. Its discovery showed that some of the building blocks of life existed in this system at the time of planet formation. The high sensitivity of ALMA – even at the technically challenging shortest wavelengths at which it operates – was instrumental for these observations, which were made with a partial array of antennas during the observatory’s science-verification phase.

Another primary scientific goal of ALMA was the observation of primordial galaxies, and it achieved this with just a handful of antennas, in its compact configuration. The discovery was announced on the day of its official inauguration in March 2013. “By using the gravitational lensing technique,” explains Marconi, “we could shed light on 26 of the universe’s most primordial galaxies, where vast reservoirs of dust and gas are converted into new stars, at a pace of 10,000 per year.” The study concluded that these starburst galaxies are much more abundant, much more distant (about 12 billion light-years away) and considerably older (by about one billion years) than previously assumed. Astronomers observed their formation as it started, two billion years after the Big Bang. One of the galaxies discovered was already in existence a mere one billion years after the birth of the universe, its light having travelled the length of the cosmos since that time.

Since late 2014, ALMA has been experimenting with its very-long-baseline interferometry, using 22–36 antennas arranged with baselines of up to the maximum 16 km. The resolution obtained with such an extended configuration is equivalent to that of a telescope as large as 15 km in diameter. Such a resolution enabled ALMA, among other things, to observe for the first time an incredibly powerful magnetic field in the close vicinity of the event horizon of a supermassive black hole at the centre of a distant galaxy (CERN Courier June 2015 p12). “Up to now, only weak magnetic fields several light-years (not light-days) away from black holes had been probed,” confirms Marconi. Daniel Espada’s favourite record among such an impressive series of “firsts” is another recent one, made by ALMA in its almost final configuration (54 antennas). “The most spectacular result, for me, is the unprecedented observation of proto-planetary systems,” says Espada. “Seeing the formation of a planet-forming disc around a young star, basically a solar system like ours, live, as it happens, was an incredible breakthrough and left all of us astounded.”
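The quoted equivalence follows from the diffraction limit of an interferometer, θ ≈ λ/B. A back-of-the-envelope Python calculation (the wavelength is an assumed, band-6-like value, not taken from the observatory):

wavelength = 1.3e-3                  # observing wavelength in metres (assumed)
baseline = 15e3                      # effective aperture in metres
theta_rad = wavelength / baseline    # diffraction-limited angular resolution
theta_mas = theta_rad * 206_265e3    # radians -> milliarcseconds
print(f"resolution ~ {theta_mas:.0f} milliarcseconds")   # ~18 mas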

In the recently published images of the discovery of a proto-planetary system in our Galaxy, about 450 light-years away, astronomers could clearly identify dust and gas gradually forming into planets and asteroids, in the same way as happened with Saturn’s rings, or even our own solar system, a few billion years ago. The picture (CERN Courier January/February 2015 p15) is so far the sharpest ever made at submillimetre wavelengths, with a resolution exceeding Hubble’s, and started “a new era in our exploration of the formation of stars and planets, and could revolutionise theories of planetary formation”, as Tim de Zeeuw, Director-General of ESO, declared in the press release. “We now know for sure our solar system is not alone,” comments Espada.

What does this dream instrument have in store for the near future? Astronomers agree that with ALMA’s capacity, everything is possible – the expected and the unexpected. “ALMA can observe the entire sky, we can follow the ecliptic of our solar system, we can observe our Galaxy, we see the Santiago trail, we can observe beyond our Galaxy and anywhere else in the sky. We can also observe the composition of the atmosphere of planets and satellites near Earth. Together with optical telescopes, we’re looking for planets similar to Earth. We might not encounter other forms of life on these other planets, but we can observe their main characteristics and the atmosphere,” concludes Espada, before getting ready for his night shift, while I remain speechless in the extraterrestrial light of yet another out-of-this-world sunset over the Atacama desert.


Watch the video interviews with David Rabanus at cds.cern.ch/record/2062981 and cds.cern.ch/record/2062982.

Yoichiro Nambu: breaking the symmetry

 

Yoichiro Nambu passed away on 5 July 2015 in Osaka. He was awarded the Nobel Prize in Physics in 2008 “for the discovery of the mechanism of spontaneous broken symmetry in subatomic physics”. Nambu’s work in theoretical physics, spanning more than half a century, was prophetic, and played a key role in the development of one of the great accomplishments of 20th-century physics – the Standard Model of particle physics. He was also among those who laid the foundations of string theory.

The early years

When Nambu graduated from the University of Tokyo in 1943, Japan was in the midst of the Second World War – but at the same time, Japanese physics was extremely vibrant. Among other things, a group of superb Japanese physicists was developing the framework of quantum field theory. The spark came from the work of Hideki Yukawa, who in the 1930s laid the foundations of modern particle physics with his prediction that the force between nucleons inside a nucleus is caused by the exchange of a particle (today called the pion) that, unlike the photon, has a mass. Yukawa showed that this results in a force that dies out quickly as the distance between the nucleons is increased, as opposed to electromagnetic forces, caused by a massless photon, which have infinite range. Yukawa became Japan’s first Nobel laureate in 1949. Soon afterwards, Japan became a powerhouse of particle physics and quantum field theory. In 1965, Sin-Itiro Tomonaga received the Nobel prize (shared with Richard Feynman and Julian Schwinger) for his work on the quantum field theory of electromagnetism.

In 1948, Nambu joined a select group of theoretical physicists at the newly formed department at Osaka City University. He spent three formative years there: “I had never felt and enjoyed so much the sense of freedom.” Much of his early work dealt with quantum field theory. One influential paper dealt with the derivation of the precise force laws in nuclear physics. In the process, he derived the equation that describes how particles can bind with each other – an equation that was later derived independently by Bethe and Salpeter, and is now known commonly as the Bethe–Salpeter equation.

Nambu always felt that his work in physics was guided by a philosophy – one that was uniquely his own. During his years in Osaka, he was deeply influenced by the philosophy of Sakata and Taketani. Sakata was yet another prominent theoretical physicist in Japan at that time: he later became well known for the Sakata model, which was a precursor to the quark model of nuclear constituents. Sakata was influenced by Marxist philosophy, and together with Taketani developed a “three-stage methodology” in physics. As Nambu recalled later, Taketani used to visit the young group of theorists at Osaka and “spoke against our preoccupation with theoretical ideas, emphasised to pay attention to experimental physics. I believe that this advice has come to make a big influence on my attitude towards physics”. Together with colleagues Nishijima and Miyazawa, he immersed himself in understanding the properties of the newly discovered elementary particles called mesons.

In 1952, J R Oppenheimer invited Nambu to spend a couple of years at the Institute for Advanced Study in Princeton. By his own account, this was not a particularly fruitful period: “I was not very happy.” After a summer at Caltech, he finally came to the University of Chicago at the invitation of Marvin Goldberger. There he became exposed to a remarkably stimulating intellectual atmosphere, which epitomised Fermi’s style of “physics without boundaries”. There was no “particle physics” or “physics of metals” or “nuclear physics”: everything was discussed in a unified manner. Nambu soon achieved a landmark in the history of 20th-century physics: the discovery that a vacuum can break symmetries spontaneously. And he came up with the idea while working in a rather different area of physics: superconductivity.

Symmetries of the laws of nature often provide guiding principles in physics. An example is “rotational symmetry”. Imagine yourself to be in deep space, so far away from any star or galaxy that all you can see in any direction is empty space. Things look completely identical in all directions – in particular, if you are performing an experiment, the results would not change if you slowly rotated your lab and did the same thing. It is this symmetry that leads to the conservation of angular momentum. Of course, the rotational symmetry is only approximate, because there are stars and galaxies that break this symmetry explicitly.

There are other situations, however, where a symmetry is broken spontaneously. One example is a magnet. The molecules inside a magnet are themselves little magnetic dipoles. If we switch on a small magnetic field, then the rotational symmetry is broken explicitly and all of the dipoles align themselves in the direction of the magnetic field. That is simple. The interesting phenomenon is that the dipoles continue to be aligned in the same direction, even after the external magnetic field is switched off. Here the rotational symmetry is broken spontaneously.

Nevertheless, the fact that the underlying laws respect rotational symmetry has a consequence: if we gently disturb one of the dipoles from its perfectly aligned position, it gently nudges its neighbours and they nudge their neighbours, and the result is a wave that propagates through the magnet. Such a wave has very low energy and is called a spin wave. This is a special case of a general phenomenon where a spontaneously broken symmetry has an associated low-energy mode, or in quantum theory an associated massless particle.
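The same point can be checked symbolically with a toy “Mexican hat” potential: at a symmetry-breaking minimum the curvature is positive in the radial direction but vanishes along the circle of minima – the low-energy Goldstone mode. A small Python sketch, illustrative only:

import sympy as sp

x, y, lam, v = sp.symbols('x y lam v', positive=True)
V = lam * (x**2 + y**2 - v**2)**2        # rotationally symmetric "Mexican hat"

# Mass matrix (second derivatives) evaluated at the minimum (x, y) = (v, 0):
H = sp.hessian(V, (x, y)).subs({x: v, y: 0})
print(H)   # Matrix([[8*lam*v**2, 0], [0, 0]]): one massive mode, one massless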

Breaking symmetry

Nambu took the concept of spontaneous symmetry breaking to a new level. He came up with this idea while trying to understand the Bardeen–Cooper–Schrieffer (BCS) theory of superconductivity. Superconductors are materials that conduct electric current without any resistance. Superconductors also repel external magnetic fields – an effect called the Meissner effect. Inside a superconductor, electromagnetic fields are short-ranged rather than long-ranged: as if the photon had acquired a mass, like Yukawa’s mesons. However, a massive photon appears to be inconsistent with gauge invariance – a basic property of electromagnetism.

It was Nambu in 1959, and independently Philip Anderson a little earlier in 1958, who understood what was going on. They realised that (in the absence of electromagnetic interactions) the superconducting state broke the symmetry spontaneously. This symmetry is unlike the rotation symmetry that is spontaneously broken in magnets or crystals: it is a symmetry associated with the fact that electric charge is conserved. If we imagine switching off the electromagnetic interaction, this symmetry breaking would also result in very low-energy waves, like spin waves in a magnet – a massless particle. Now comes the great discovery: once the electromagnetic interaction is switched back on, as it is in nature, the apparent symmetry breaking can be undone by a gauge transformation, which is local in space (and time), without any energy cost. Hence, there is no massless particle; in fact, the photon becomes massive, together with a massive neutral particle, which explains the Meissner effect. The neutral scalar excitation in superconductors was discovered 20 years after it was predicted. This effortless excursion across traditional boundaries of physics characterised Nambu’s work throughout his career.

Soon after finishing his work on superconductivity, Nambu returned to particle physics. The first thing he noticed was that the Bogoliubov equations describing excitations near the Fermi surface in a superconductor are very similar to the Dirac equation that describes nucleons. The energy gap in a superconductor translates to the mass of nucleons. The charge symmetry that is spontaneously broken in a superconductor (electromagnetism switched off) also has an analogue – chiral symmetry. If the energy gap in a superconductor is a result of spontaneous symmetry breaking of charge symmetry, could it be that the mass of a nucleon is the result of spontaneous symmetry breaking of chiral symmetry? Unlike the charge symmetry in a superconductor, chiral symmetry is a global symmetry that can be truly spontaneously broken, leading to a massless particle – which Nambu identified with the pion. This is exactly what Nambu proposed in a short paper in 1960, soon followed by two papers with Jona-Lasinio.

This was a revolutionary step. In all previous examples, spontaneous symmetry breaking happened in situations where there were constituents (the molecular dipoles in a magnet, for example) and the underlying laws did not permit them to arrange themselves maintaining the symmetry. Nambu, however, proposed that there are situations where spontaneous symmetry breaking can happen in the vacuum of the world.

In physics, vacuum is the name given to “nothing”. How can a symmetry be broken – even spontaneously – when there is nothing around? The radical nature of this idea has been best described by Phil Anderson: “To me – and perhaps more to his fellow particle theorists – this seemed like a fantastic stretch of imagination. The vacuum, to us, was and always had been a vacuum – it had, since Einstein got rid of the aether, been the epitome of emptiness…I, at least, had my mind encumbered with the idea that if there was a condensate, there was something there…This is why it took a Nambu to break the first symmetry.”

Nambu was proposing that the masses of elementary particles have an origin – something we can calculate. The revolutionary nature of this idea cannot be overstated. Soon after the papers of Nambu and Jona-Lasinio, Goldstone came up with a simpler renormalisable model of superconductivity, which illustrated the phenomenon of spontaneous symmetry breaking by construction, and he provided a general proof that such symmetry breaking always leads to a massless particle.

Meanwhile, in 1963 Anderson realised that the mechanism of generating masses for gauge particles that was discovered in superconductivity could be useful in elementary particle physics in the context of the nature of “vacuum of the world”. The mechanism was subsequently worked out in full generality by three independent groups, Higgs, Englert and Brout, and Guralnik, Hagen and Kibble, and is called the “Higgs mechanism”. It became the key to formulating the Standard Model of particle physics by Weinberg and Salam, building on the earlier work of Glashow, and resulting in our current understanding of electromagnetic and weak forces. The analogue of the special massive state in a superconductor is the Higgs particle, discovered at CERN in 2012.

We now know, for certain, that chiral symmetry is spontaneously broken in strong interactions. However, the final realisation of this idea had to wait until another work by Nambu.

The idea that all hadrons (particles that experience strong forces) are made of quarks was proposed by Gell-Mann, and independently Zweig, in 1964. However, the idea soon ran into serious trouble.

Now, the quarks that make up nucleons have spin ½. According to the spin-statistics theorem, they should be fermions obeying the exclusion principle. However, it appeared that if quarks are indeed the constituents of all hadrons, they cannot at the same time be fermions. To resolve this contradiction, Nambu proposed that quarks possess an attribute that he called “charm” and is now called colour. In his first proposal, quarks have two such colours. Subsequently, in a paper with M Y Han, he proposed a model with three colours. Two quarks may appear identical (and therefore cannot be on top of each other) if their colour is ignored. However, once it is recognised that their colours are different, they cease to be identical, and the usual “exclusion” of fermions does not apply. A little earlier, O Greenberg came up with another resolution: he postulated that quarks are not really fermions but something called “para-fermions”, which have unconventional properties that are just right to solve the problem.

However, it was Nambu’s proposal that turned out to be more fruitful. This is because he made another remarkable one: colour is like another kind of electric charge. A quark produces not only an ordinary electric field but also a new kind of generalised electric field. This new kind of electric field causes a new kind of force between quarks, and the energy is minimised when the quarks form a colour singlet. This force, Nambu claimed, is the basic strong force that holds the quarks together inside a nucleon. This proposal turned out to be essentially correct, and is now known as quantum chromodynamics (QCD). In the model of Han and Nambu, quarks carry integer charges, which we now know is incorrect. In 1972, Fritzsch and Gell-Mann wrote down the model with the correct charge assignments and proposed that only colour singlets occur in the spectrum, which would ensure that fractionally charged quarks remain unobserved. However, it was only after the discovery by David Gross, Frank Wilczek and David Politzer in 1973 of “asymptotic freedom” for the generalised electric field that QCD became a candidate theory of the strong interactions. It explained the observed scaling properties of the strong interactions at high energies (which probe short distances) and indicated that the force between quarks has a tendency to grow as they are pulled apart.
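Asymptotic freedom can be illustrated with the standard one-loop running of the strong coupling; the Python sketch below uses textbook-like inputs, not tied to any particular measurement:

import numpy as np

alpha_mz, m_z, n_f = 0.118, 91.2, 5     # coupling at the Z mass, GeV, flavours
b0 = (33 - 2*n_f) / (12 * np.pi)        # one-loop beta-function coefficient

def alpha_s(Q):
    """One-loop strong coupling at scale Q (GeV)."""
    return alpha_mz / (1 + alpha_mz * b0 * np.log(Q**2 / m_z**2))

for Q in (2.0, 91.2, 1000.0):
    print(f"alpha_s({Q:g} GeV) = {alpha_s(Q):.3f}")
# The coupling grows towards low Q (quarks bind ever more tightly as
# they are pulled apart) and shrinks at high Q (nearly free quarks).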

Simple dynamical principle

String theory, which is recognised today as the most promising framework of fundamental physics including gravity, had its origins in making sense of strongly interacting elementary particles in the days before the discovery of asymptotic freedom. To make a long story short, Nambu, Nielsen and Susskind proposed that many mathematical formulae of the day, which originated from Veneziano’s prescient formula, could be explained by the hypothesis that the underlying physical objects were strings (one-dimensional objects) rather than point particles. This was a radical departure from the “Newtonian” viewpoint that elementary laws of nature are formulated in terms of “particles” or point-like constituents.

Nambu (and independently Goto) also provided a simple dynamical principle with a large local symmetry for consistent string propagation. His famous paper on the string model entitled “Duality and hadrodynamics” was submitted to the Copenhagen High Energy Physics Symposium in 1970. In a letter dated 4 September 1986, to one of us (SRW), Nambu wrote: “In August 1970, there was a symposium to be held in Copenhagen just before a High Energy Physics Conference in Kiev, and I was planning to attend both. But before leaving for Europe, I set out to California with my family so that they could stay with our friends during my absence. Unfortunately our car broke down as we were crossing the Great Salt Lake Desert, and we were stranded in a tiny settlement called Wendover for the three days. Having missed the flight and the meeting schedules, I cancelled the trip in disgust and had a vacation in California instead. The manuscript, however had been sent out to Copenhagen, and survived.”

It is quite common for scientists to become excessively attached to their own creations. In contrast, Nambu was remarkably open-minded. To him, his work was like placing a few pieces into a giant jigsaw puzzle: he never thought that he had discovered the “ultimate truth”. This deep sense of modesty was also a part of his personality. To the entire community of physicists, he was this shy, unassuming man, often difficult to understand, coming up with one original idea after another. There was a sense of play in the way that he did science: maybe that is why his ideas were sometimes incomprehensible when they first appeared.

Nambu’s legacy, “physics without boundaries”, must have had a subconscious influence on some of us in India involved in setting up the International Centre for Theoretical Sciences (ICTS), a centre of TIFR in Bangalore, where “science is without boundaries”.

We end with a quote from Nambu’s speech at the Nobel presentation ceremony at the University of Chicago on 10 December 2008, which clearly shows his view of nature: “Nowadays, the principle of spontaneous symmetry breaking is the key concept in understanding why the world is so complex as it is, in spite of the many symmetry properties in the basic laws that are supposed to govern it. The basic laws are very simple, yet this world is not boring; that is, I think, an ideal combination.”

• An earlier version of the article appeared in Frontline magazine, see www.frontline.in/other/obituary/a-giant-of-physics/article7593580.ece.

Our Courier

The CERN Courier is not exclusively CERN’s. Its subtitle “International Journal of High-Energy Physics” stands as a friendly warning to all those readers who might otherwise think it is an official mouthpiece of the CERN laboratory. As the new editor, I share my predecessor’s vision (and hope) of producing a magazine that will interest and stimulate the entire high-energy physics community across the world.

Over the last decade, the community has expanded to encompass physicists from many different areas – not just accelerator physics and not just from CERN. Today, the high-energy frontier is being explored not only by particle physicists but also by astrophysicists, cosmologists, astroparticle physicists and neutrino physicists. We use accelerators such as the unique LHC, but also satellites and detectors installed on the International Space Station. The hard-won results of physicists worldwide are increasingly a collaborative effort, where the boundaries between the various sub-disciplines have faded to nothing.

Our ambition must be to follow the natural evolution of the high-energy physics community and continue to be its magazine for years to come. How will we achieve this? You might have already noticed a few small changes in the November issue. A first visible change is this “Viewpoint”. Up until the October issue, it could be found at the end of the issue. Now it has been placed at the start, and its role has changed from that of an opinion piece to being the opening article intended to grab the reader’s attention. Is it working? Are you reading it? Please let me know. Although this is probably the first time that we have appealed for feedback directly in these pages, the fact that the CERN Courier is open to contributions and feedback from the wider community is far from new. From when the magazine was first published online, the “Contact us” webpage has stated the following, in French and English: “CERN Courier welcomes contributions from the international high-energy physics community. These can be written in English or French, and will be published in the same language. If you have a suggestion for an article, please send your proposal to the editor.”

In other words, for many years we have been eager to hear from you. And, indeed, you have communicated with us and given your feedback, and we have published your work, your professional ambitions, and your points of view. We have been part of your life and you have been part of ours. Many thanks for that. And what does the future hold? The CERN Courier will continue to bring you authoritative insight into the science; it will continue to keep you abreast of developments at CERN and other laboratories worldwide; and it will continue to bring you the very best images and, where possible, the very best video clips (yes, purely “sciency” videos, produced exclusively for the CERN Courier – see “A close look at the world’s largest astronomical project” in this issue) and other multimedia material.

Being an editor of a (still) printed publication in 2015 is no easy task. Out there in the world, information flows fast. Here, at the CERN Courier, we still take time to do things properly. As Christine Sutton, the previous editor, said in her “Viewpoint” in the November issue, our ambition is to take you “behind the headlines” and bring you the real protagonists with their full stories. The CERN Courier has the space, and that space is for you.

Let me take this opportunity to thank all of our regular contributors. Most of them have collaborated with us on a voluntary basis for many years and are the backbone of the magazine. Their profiles, together with that of our new “Bookshelf” editor, Virginia Greco, are available at preview-courier.web.cern.ch/cws/our-team. Obviously, the magazine would not exist without the hundreds of contributors worldwide who send us their texts, be they a feature article or a short piece for “Faces & Places”. A big thank you to everyone.

The CERN Courier adventure continues.

Qu’est-ce que le boson de Higgs mange en hiver et autres détails essentiels

By Pauline Gagnon
MultiMondes
Hardback: €29
E-book: €19
Also available at the CERN bookshop


Pauline Gagnon is well known in the community of LHC experimentalists because, in addition to her contribution to the ATLAS experiment, she was a member of CERN’s communication group from 2011 to 2014, and on the Quantum Diaries blog she covered many recent events related to the laboratory’s scientific activity.

The title of her book, written in French (“What does the Higgs boson eat in winter”), is somewhat misleading, because the author goes well beyond describing the Brout–Englert–Higgs mechanism and the experimental discovery of the Higgs boson in 2012. Her book offers not only an overview of the physics studied in the LHC experiments, of the accelerator complex and detectors built for this research, and of the statistical methods employed in the discovery of the Higgs boson, but also includes a chapter describing the original (and probably unique) organisation of the large international collaborations in high-energy physics, as well as a chapter on the transfer of technology and knowledge from our field to the economic world and the general public.

The book also describes the links between high-energy physics and astrophysics, with a chapter devoted to the experimental evidence that led to postulating the existence of dark matter, and to a comparison of the discovery potential of accelerator-based and non-accelerator experiments. Another chapter is devoted to supersymmetry, currently the most popular theory beyond the Standard Model for answering the questions that the latter cannot resolve, and to the challenges awaiting the LHC experiments in the coming years. The book ends with a discussion of a theme that is somewhat disconnected but dear to the author’s heart, namely the question of diversity (in particular the employment of women) in the world of scientific research.

The book is not aimed at specialists but targets the general public. To this end, the author has banished all mathematical formulae and often uses analogies to introduce the various concepts. The more complex or detailed parts are placed in separate boxes that the reader can choose to skip. In the same spirit, each chapter ends with a summary of about a page, allowing an abridged reading of the topic, to be revisited later if desired. The style is simple and direct, often with a touch of humour. The discussion is not superficial, however, and it seems to me that the book is nonetheless addressed to readers with some basic scientific knowledge, for example young students who want to understand the interest and goals of particle-physics research.

The Singular Universe and the Reality of Time: A Proposal in Natural Philosophy

By Roberto Mangabeira Unger and Lee Smolin
Cambridge University Press
Hardback: $20
E-book: $17


This is a book on natural philosophy, a field that the authors argue, and convincingly so, has not had much activity for a long time. It is definitely not a popularisation, although it is written clearly enough (and free of equations) that it should be accessible to most knowledgeable readers.

In many ways, this is two books: one of about 350 pages by Unger, a philosopher, and another of about 150 pages by Smolin, a physicist, each presenting overlapping but often dissenting views, together with a discussion of these differences. This means one can be quite comfortable reading it and agreeing or not, as each point is raised.

Perhaps the key idea is that history might play a role in determining why the universe is the way it is, in as fundamental a way as history determines much of biology. This challenges many of the fundamental assumptions that go into cosmology and physics, including the idea that the “laws” of physics are somehow hard-wired into the universe; instead, the laws could conceivably evolve. Indeed, in biology the governing laws emerge as the space of living things evolves. This puts causal connections in the driving seat, and is akin to taking the Darwinian viewpoint in biology over the creationist myth. A new view emerges on why things are the way they are – an alternative to some hypothetical “elegant(?)” future derivation of why, for example, masses and couplings are what they are.

The authors eschew some ideas that often occur today, including that of a multiverse of which ours is but one universe (the “singular” in the title means there is just one), and the idea that time is somehow not real, rather than something that leads to a genuine history. They even argue that mathematics may not merit the (“prophetic”, as they put it) role that we often give it.

It’s a hard book to put down. Whether or not one agrees with the points that are raised, the book is nothing if not thought-provoking, and the ideas could well be revolutionary.

A wealth of data for physics from the LHC: 1400 colliding bunches per beam and counting

Thanks to the work done during the LHC machine-development period and technical stop at the end of the summer, the LHC is enjoying a stable-intensity ramp-up period, which is giving experiments precious data for their physics programmes.

During the machine-development break at the end of August, a variety of measurement and development programmes were carried out in the machine. They included tests exploring the limits of smaller beam sizes at the interaction points, and studies of collimation using bent crystals. Highlights also included the validation of a 40 cm β*, which effectively doubles the luminosity potential of the present set-up. Free from the challenges of high beam intensity, the machine enjoyed high availability during this remarkably successful machine-development period.

This period was followed by a five-day technical stop. The key objectives were modifications to the critical Quench Protection System, the consolidation of the cooling and electrical-distribution systems, and important maintenance work on the cryogenics system. A huge number of activities went into making the technical stop a success.

The effort paid off: since the end of the technical stop, the LHC has gone smoothly through a complete validation period with beam, which ensures that the machine is ready for the intensity ramp-up from a machine-protection standpoint.

The validation is obtained step-by-step and with increasing intensity, both in terms of the number of bunches and the particles in each bunch. The first step consists of running through a full LHC cycle, from injection to collisions and beam dump. This is done initially with a low-intensity bunch (“probe”) to check all of the machine settings and equipment. This phase is followed by a series of collimation- and absorber-validation tests at different points in the LHC cycle. Low-intensity beams – typically the equivalent of three nominal bunches (3 × 10¹¹ protons) – are expanded transversely or longitudinally, or are de-bunched to verify that the collimators and absorbers are correctly intercepting lost particles. The techniques for those validations have been improved progressively, and they can now be performed within 24 h in a few machine cycles.

As soon as the protection systems were validated with the probe beam, the intensity of the beam was ramped up in three steps to 459 bunches per beam – the level that had been reached before the summer stop. Further intensity ramp-ups are performed stepwise: at each step, the LHC must be operated for at least three periods of stable collisions. This is equivalent to integrating at least 20 h of operation before the next intensity step can be authorised. At each step, operators carefully analyse the data collected across many systems, in particular those related to machine protection, and give the green light for an intensity step only when all of the systems show satisfactory performance.

Following this scheme, about 10 days after the end of the break, the machine could be operated with around 1000 bunches per beam and 25 ns bunch spacing, which is the LHC design bunch spacing.

In the present beam configuration, the electron-cloud activity is still significant and considerable power is deposited onto the vacuum-chamber beam screen. For good performance of the machine, the beam-screen temperature should remain below 30 K, and this is achieved by managing the heat-load transients. This operation is particularly delicate for the cryogenic-system operation team in the CERN Control Centre during the injection and energy ramp-up of beams.

The beam intensity of Run 2 can also be measured in terms of the energy stored in each beam: with more than 1000 bunches per beam, the stored energy in each beam exceeds 100 MJ. Towards the end of September, the machine reached 150 MJ, breaking the record of Run 1 (140 MJ).
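The arithmetic behind these numbers is simple; the Python sketch below assumes a typical nominal bunch population of 1.15 × 10¹¹ protons at 6.5 TeV per proton:

e_charge = 1.602e-19                     # joules per electronvolt
protons_per_bunch = 1.15e11              # assumed nominal bunch population
bunches = 1000
energy_per_proton = 6.5e12 * e_charge    # 6.5 TeV expressed in joules

stored = bunches * protons_per_bunch * energy_per_proton
print(f"stored energy ~ {stored/1e6:.0f} MJ per beam")   # ~120 MJ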

CMS observes long-range correlations in pp collisions at 13 TeV


The CMS collaboration has published its first particle-correlation result from proton–proton (pp) collisions at a centre-of-mass energy of 13 TeV. The paper describes the observation of a phenomenon first seen in nucleus–nucleus collisions, and also detected by CMS in 2010 in the initial LHC pp collision run, at a centre-of-mass energy of 7 TeV. CMS later also observed the phenomenon in proton–lead (pPb) collisions at a centre-of-mass energy of 5 TeV per nucleon pair. The phenomenon is an unexpected correlation between pairs of particles appearing in so-called high-multiplicity collisions – collisions that produce a large number of particles, i.e. more than about 100 charged particles with transverse momentum pT > 0.4 GeV/c within the pseudorapidity region |η| < 2.4. The correlation manifests itself as a ridge-like structure in a 2D angular correlation function.

Following the CMS observation at 7 TeV, interest was expressed concerning the dependence of this phenomenon on the centre-of-mass energy. To more readily address this question, CMS collected a special 13 TeV data set, with an integrated luminosity of 270 nb⁻¹. Here, the average number of simultaneous collisions in a beam bunch crossing was as low as about 1.3, presenting conditions similar to those used for the 7 TeV analysis. Because the effect is expected to appear only in high-multiplicity events, a special trigger was developed based on the number of charged particles detected in the silicon tracker system.

Indeed, about once in every 3000 pp collisions with the highest produced particle multiplicity at 13 TeV, CMS observes an enhancement of particle pairs with small relative azimuthal angle Δφ (figure 1). It therefore appears that charged particles have a slight preference to be emitted pointing in nearly the same azimuthal direction, even if they are very far apart in terms of polar angle, which is measured by the quantity η.
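Schematically, the correlation function is built by histogramming pair separations in Δη and Δφ over many events. A toy Python sketch with synthetic, uncorrelated particles (a real analysis also averages over events and divides by a mixed-event background):

import numpy as np
from itertools import combinations

rng = np.random.default_rng(7)
eta = rng.uniform(-2.4, 2.4, 120)        # one synthetic high-multiplicity event
phi = rng.uniform(0.0, 2*np.pi, 120)

deta, dphi = [], []
for i, j in combinations(range(len(eta)), 2):
    d = (phi[i] - phi[j]) % (2*np.pi)
    dphi.append(min(d, 2*np.pi - d))      # fold the angle difference to [0, pi]
    deta.append(abs(eta[i] - eta[j]))

hist, _, _ = np.histogram2d(deta, dphi, bins=(12, 12))
# A "ridge" would appear as an excess at small delta-phi that persists
# out to large delta-eta once the uncorrelated background is divided out.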

Such correlations are reminiscent of effects first seen in nucleus–nucleus collisions at Brookhaven’s RHIC and later in collisions of lead–lead nuclei (PbPb) at the LHC. Nucleus–nucleus collisions produce a hot, dense medium similar to the quark–gluon plasma thought to have existed in the first microseconds after the Big Bang. The long-range correlations in PbPb collisions are interpreted to result from a hydrodynamic expansion of this medium. Such a medium was not expected in the simpler pp system, and therefore the CMS results from 2010 led to a variety of theoretical models aiming for an explanation.

Remarkably, the new 13 TeV results demonstrate that, within the experimental uncertainties, the strength of the correlation (expressed in terms of associated particle yield) does not depend on the centre-of-mass energy of the pp collision but only on the particle multiplicity. This lack of energy dependence is similar to what is observed for hydrodynamic-flow coefficients measured in nucleus–nucleus collisions at RHIC and the LHC. Compared with the pp results, pPb and PbPb collisions produce correlations that are four and ten times stronger, respectively, but which are qualitatively very similar to the pp results. The new results from pp collisions extend the measurements to much higher multiplicities compared with those at 7 TeV, and provide the opportunity to understand this curious phenomenon better.

Supersymmetry searches: the most comprehensive ATLAS summary to date


ATLAS has summarised 22 Run 1 searches, using more than 310,000 models to work out where the elusive SUSY particles might be hiding.

The first run of the LHC taught us at least two significant things. First, that there really is a Higgs boson, with properties broadly in line with those predicted by the Standard Model. Second, that the hotly anticipated supersymmetric (SUSY) particles – which were believed to be needed to keep the Higgs boson mass under control – have not been found.

If, as many believe, SUSY is the solution to the Higgs-mass problem, there should be a heavy partner particle for each of the familiar Standard Model fermions and bosons. So why have we missed the super partners? Are they not present at LHC energies? Or are they just around the corner, waiting to be found?

ATLAS has recently taken stock of its progress in addressing the question of the missing SUSY particles. This herculean task examined an astonishing 500 million different models, each representing a possible combination of SUSY-particle masses. The points were drawn from the 19-parameter “phenomenological Minimal Supersymmetric Standard Model” (pMSSM), and the study concentrated on those models that can contribute to the cosmological dark matter.

The ambitious project involved the detailed simulation of more than 600 million high-energy proton–proton collisions, using the power of the LHC computing grid. Teams from 22 individual ATLAS SUSY searches examined whether they had sensitivity to each of the 310,000 most promising models. This told them which combinations of SUSY masses have been ruled out by the ATLAS Run 1 searches and which masses would have evaded detection so far.
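The workflow can be caricatured in a few lines of Python: sample a high-dimensional parameter space, keep the “promising” points, then test each against search sensitivities. Every number below – ranges, filter and limit – is invented purely for illustration and bears no relation to the actual ATLAS selection:

import numpy as np

rng = np.random.default_rng(0)
points = rng.uniform(0.1, 4.0, size=(100_000, 19))   # toy masses in TeV

# Stand-in "dark-matter" filter: demand a light neutralino-like state
promising = points[points[:, 0] < 1.0]

# Stand-in exclusion from one search: gluino-like mass below 1.4 TeV
excluded = promising[:, 1] < 1.4
print(f"{len(promising)} promising models; "
      f"{excluded.mean():.0%} excluded by this one toy search")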

The results are illuminating. They show that in Run 1, ATLAS had particular sensitivity to SUSY particles with sub-TeV masses and with strong interactions. Their best constraints are on the fermionic SUSY partner of the gluon and, to a lesser extent, on the scalar partners of the quarks. Weakly interacting SUSY particles have been much harder to pin down, because those particles are produced more rarely. The conclusions are broadly consistent with those obtained using simplified models, which are being used to guide Run 2 SUSY searches.

The paper goes on to examine the knock-on effects of the ATLAS searches for other experiments. The ATLAS searches constrain the SUSY models that are being hunted by underground searches for dark-matter relics, and by indirect searches, including those measuring rare B-meson decays and the magnetic moment of the muon.

Today, the higher energy of the 13 TeV LHC is bringing increased sensitivity to rare processes and to higher-mass particles. The ATLAS physics teams are excited to be using their fresh knowledge about where SUSY might be hiding to start the hunt afresh.
