Nearby supernova accounts for cosmic-ray ‘anomalies’

The observed spectrum of cosmic rays has several puzzling features. For instance, the Alpha Magnetic Spectrometer (AMS) measured an excess of positrons and antiprotons at around 100 GeV. While it is tantalising to interpret such discrepancies as dark-matter signatures, a new study shows that they could simply be due to the injection of cosmic rays by a nearby supernova that exploded two million years ago.

The composition and spectral properties of cosmic rays are being studied with unprecedented accuracy by the AMS-02 experiment (CERN Courier October 2013 p23). Mounted on the International Space Station since May 2011, it has already detected and characterised close to 100 thousand million cosmic rays. This wealth of data places extremely precise constraints on the spectral properties of electrons, positrons, protons and antiprotons, as well as of helium and heavier nuclei.

Rather than disproving earlier results by the Payload for Antimatter Matter Exploration and Light nuclei Astrophysics (PAMELA) (CERN Courier September 2011 p34), the AMS measurements confirm several cosmic-ray “anomalies”. Indeed, the observations do not follow the expected trend for galactic cosmic rays. The main differences are a softer spectral slope of protons as compared with heavier nuclei in the TeV–PeV energy range, and an excess of positrons and antiprotons above ~30 GeV.

According to a small group of astrophysicists, these puzzling discrepancies could simply be due to an additional source of cosmic rays: a nearby supernova. Led by Michael Kachelrieß from the Norwegian University of Science and Technology in Trondheim, the team demonstrates in a recent paper that this is a valid explanation. Using a code that follows the trajectories of individual cosmic rays through the galactic magnetic field (GMF) after an instantaneous injection, the researchers obtain a good match with the observations for a source injecting ~10⁴³ J in cosmic rays with energies up to at least 30 TeV.
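
The authors’ code integrates individual trajectories, but the scales involved can be illustrated with the much simpler isotropic-diffusion approximation for a burst-like source, in which the particle density a time t after injection falls off as exp(-r²/4Dt). The sketch below assumes a typical Galactic diffusion coefficient and round numbers throughout; it is purely illustrative, not the authors’ method.

```python
import numpy as np

# Illustrative only (not the authors' trajectory code): cosmic-ray density a
# time t after an instantaneous point injection, in the isotropic diffusion
# approximation n(r, t) = N0 * (4*pi*D*t)**-1.5 * exp(-r**2 / (4*D*t)).
# The diffusion coefficient is an assumed, typical Galactic value.

KPC_CM = 3.086e21   # 1 kpc in cm
YR_S = 3.156e7      # 1 yr in s

def diffusion_coefficient(E_GeV, D0=1e28, delta=0.5):
    """Assumed power-law diffusion coefficient, in cm^2/s."""
    return D0 * E_GeV**delta

def burst_density(r_cm, t_s, E_GeV, N0=1.0):
    """Density at distance r a time t after injecting N0 particles."""
    D = diffusion_coefficient(E_GeV)
    return N0 * (4 * np.pi * D * t_s)**-1.5 * np.exp(-r_cm**2 / (4 * D * t_s))

t = 2e6 * YR_S        # a two-million-year-old source
r = 0.1 * KPC_CM      # a few hundred light-years away
for E in (10.0, 100.0, 1000.0):  # GeV
    r_diff = np.sqrt(4 * diffusion_coefficient(E) * t) / KPC_CM
    print(f"E = {E:6.0f} GeV: diffusion radius ~{r_diff:.2f} kpc, "
          f"density at source distance {burst_density(r, t, E):.2e} cm^-3")
```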

They further find that the supernova should have been roughly aligned with the local GMF direction, at a distance of several hundred light-years, and must have exploded about two million years ago. A rough estimate, based on the size of the Milky Way and the rate of about two supernovae per century (CERN Courier January/February 2006 p10), shows that there is a good chance that one has indeed exploded at the inferred distance from the Sun during the last few million years. Interestingly, independent evidence for such a nearby supernova explosion had already been derived from a deposition of the iron isotope ⁶⁰Fe in a million-year-old layer of the ocean crust.
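
The “good chance” can be checked on the back of an envelope: scale the Galactic supernova rate by the fraction of the Galactic disc volume lying within a few hundred light-years of the Sun. The disc dimensions below are assumed round numbers.

```python
import numpy as np

# Back-of-envelope check of the estimate quoted above. All geometry values
# are assumed round numbers for a thin Galactic disc.

RATE = 2.0 / 100.0           # supernovae per year (about two per century)
T = 5e6                      # look-back time, years
R_DISC, H_DISC = 15.0, 0.3   # assumed disc radius and thickness, kpc
R_LOCAL = 0.1                # "several hundred light-years", ~0.1 kpc

v_disc = np.pi * R_DISC**2 * H_DISC        # disc volume, kpc^3
v_local = 4.0 / 3.0 * np.pi * R_LOCAL**3   # local sphere, kpc^3

expected = RATE * T * v_local / v_disc
print(f"Expected supernovae within {R_LOCAL} kpc in the last {T:.0e} yr: {expected:.1f}")
# ~2, so a nearby explosion within the last few million years is quite plausible
```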

The authors suggest further possible tests of the nearby-supernova interpretation, such as measuring the beryllium isotopic ratio ¹⁰Be/⁹Be, because ¹⁰Be has a radioactive lifetime of about one million years. If the idea withstands further data from AMS-02, it would dash hopes that the cosmic-ray “anomalies” are a signature of something more fundamental, such as dark-matter annihilation or decay.

Networking against cancer with ENLIGHT

Since the establishment of the first hospital-based proton-treatment centres in the 1990s, hadrontherapy has continued to progress in Europe and worldwide. In particular, during the last decade there has been exponential growth in the number of facilities, accompanied by a rapid increase in the number of patients treated, an expanded list of medical indications, and increasing interest in other types of ions, especially carbon. Harnessing the full potential of hadrontherapy requires the expertise of physicists, physicians, radiobiologists, engineers and information-technology experts, as well as collaboration between academic, research and industrial partners. Thirteen years ago, the need to catalyse efforts and co-operation among these disciplines led to the establishment of the European Network for Light Ion Hadrontherapy (ENLIGHT). Its recent annual meeting, held in Cracow in September, offered a broad overview of the current status and challenges of hadrontherapy, as well as stimulating discussion on the future organisation of the community.

Networking is key

ENLIGHT was launched in 2002 (CERN Courier May 2002 p29) with an ambitious, visionary and multifaceted plan to steer European research efforts in using ion beams for radiation therapy. ENLIGHT was envisaged not only as a common multidisciplinary platform, where participants could share knowledge and best practice, but also as a provider of training and education, and as an instrument to lobby for funding in critical research and innovation areas. Over the years, the network has evolved, adapting its structure and goals to emerging scientific needs (CERN Courier June 2006 p27).

ICTR-PHE

The third edition of the International Conference on Translational Research in Radio-Oncology | Physics for Health in Europe will be held in Geneva from 15 to 19 February. This unique conference gathers scientists from a variety of fields, including detector physicists, radiochemists, nuclear-medicine physicians and other physicists, biologists, software developers, accelerator experts and oncologists.

ICTR-PHE is a biennial event, co-organised by CERN, whose main aim is to foster multidisciplinary research by positioning itself at the intersection of physics, medicine and biology. At ICTR-PHE, physicists, engineers and computer scientists share their knowledge and technologies, while doctors and biologists present their needs and vision for the medical tools of the future, thereby triggering breakthrough ideas and technological developments in specific areas.

The high standards set by the ICTR-PHE conferences have attracted not only an impressive scientific community, but also ever-increasing interest and participation from industry. ICTR-PHE 2016 is also an opportunity for companies to exhibit their products and services at the technical exhibition included in the programme.

The annual ENLIGHT meeting has always played a defining role in this evolutionary process. This year, new and long-time members were challenged to an open discussion on the future of the network, after a day and a half of inspiring talks on various aspects of hadrontherapy.

Challenges ahead

Emerging topics in all forms of radiation therapy are the collection, transfer and sharing of medical data, and the implementation of big-data analytics tools to inspect them. These tools will be crucial in implementing decision support systems, allowing treatment to be tailored to each individual patient. The flow of information in healthcare, and in particular in radiation therapy, is overwhelming, not only in terms of data volume but also in the diversity of data types involved. Indeed, experts need to analyse patient and tumour data, as well as complex physical dose arrays, and to correlate these with clinical outcomes that also have genetic determinants.

Hadrontherapy faces a dilemma when it comes to designing clinical trials. From a clinical standpoint, the ever-increasing number of hadrontherapy patients would allow randomised trials to be performed – that is, systematic clinical studies in which patients are treated with comparative methods to determine the most effective curative protocol.

However, several considerations add layers of complexity to the clinical-trials landscape: the need to compare standard photon radiotherapy not only with protons but also with carbon ions; the positive results of hadrontherapy treatments for the main indications; and the non-negligible fact that most of the patients who contact a hadrontherapy centre are well informed about the technique, and will not accept being treated with conventional radiotherapy.

Nevertheless, progress on clinical trials is being made. At the ENLIGHT meeting in Cracow, the two dual-ion (proton and carbon) centres in Europe – HIT in Heidelberg (Germany) and CNAO in Pavia (Italy) – presented patient numbers and dose-distribution studies carried out at their facilities. The data were collected mainly in cohort studies within a single institution, and the results often highlighted the need for larger statistics and a unified database. More data from patients treated with carbon ions will soon become available with the opening in 2016 of the MedAustron hadrontherapy centre in Wiener Neustadt (Austria). Clinical trials are also a major focus outside Europe: in the US, several randomised and non-randomised trials have been set up to compare protons with photons, and to investigate either improvements in survival (for glioblastoma, non-small-cell lung cancer, hepatocellular carcinoma and oesophageal cancer) or reductions in adverse effects (low-grade glioma, oropharyngeal cancer, nasopharyngeal cancer, prostate cancer and post-mastectomy radiotherapy in breast cancer). Recently, the National Cancer Institute in the US funded a trial comparing conventional radiation therapy and carbon ions for pancreatic cancer.

Besides clinical trials, personalised treatments hold centre stage in the scientific debate on hadrontherapy. Technology is not dormant: developments are crucial to reduce costs, to provide treatments tailored to each specific case, and to reach the necessary level of sophistication in beam delivery to treat complex cases such as tumours inside, or close to, moving organs. In this context, imaging is key. It is becoming obvious that the optimal imaging tool will have to combine different imaging modalities, for example PET and prompt photons. PET is a mainstay for dose imaging, but a well-known issue in its application to in-beam, real-time monitoring for hadrontherapy comes from having to allow room for the beam nozzle: partial-ring PET scanners cannot provide full angular sampling, thereby introducing artefacts into the reconstructed images. The time-of-flight (TOF) technique is often used to improve the image-reconstruction process. An innovative concept, called the J-PET scanner, detects back-to-back photons in plastic scintillators and applies compressive-sensing theory to obtain better signal normalisation, and therefore improved TOF resolution.
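
The gain from TOF is easy to quantify: the difference in arrival times of the two back-to-back 511 keV photons localises the annihilation point along the line of response as Δx = cΔt/2. A minimal sketch, with illustrative timing resolutions rather than J-PET specifications:

```python
# Spatial information carried by photon timing in TOF-PET: an annihilation
# point is localised along the line of response to sigma_x = c * sigma_t / 2.
# The timing resolutions below are illustrative, not J-PET figures.

C = 2.998e8  # speed of light, m/s

def tof_localisation_mm(sigma_t_ps):
    """Uncertainty along the line of response for a given timing resolution."""
    return C * sigma_t_ps * 1e-12 / 2.0 * 1e3

for sigma_t in (600, 300, 100):  # ps
    print(f"sigma_t = {sigma_t:3d} ps -> sigma_x ~ {tof_localisation_mm(sigma_t):5.1f} mm")
```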

A subject of broad current interest within the hadrontherapy community is radiobiology. There has been great progress in understanding the molecular tumour response to irradiation with both ions and photons, and the biological consequences of the complex, less repairable DNA damage caused specifically by ions. Understanding the cell-signalling mechanisms affected by hadrontherapy will lead to improvements in therapeutic efficacy. A particularly thorny issue is the relative biological effectiveness (RBE) of protons and carbon with respect to photons. More extensive and systematic radiobiology studies with different ions, under standardised dosimetry and laboratory conditions, are needed to clarify this and other open issues: these could be carried out at existing and future beamlines at HIT, CNAO and MedAustron, as well as at the proposed CERN OpenMED facility.

The future of ENLIGHT

Since the annual meeting in summer 2014, the ENLIGHT community has been discussing the future of the network, in terms of both structure and scientific priorities. It is clear that the focus of R&D for hadrontherapy has shifted since the birth of ENLIGHT, if only for the simple reason that the number of clinical centres (in particular for protons) has increased dramatically. Also, while technology developments are still needed to ensure optimal and more cost-effective treatment, proton therapy is now solidly in the hands of industry. The advent of single-room facilities will bring proton therapy, albeit with some restrictions, to smaller hospitals and clinical centres.

From a clinical standpoint, the major challenge for ENLIGHT in the coming years will be to catalyse collaborative efforts in defining a road map for randomised trials and in studying the issue of RBE in detail. Concerning technology developments, efforts will continue on quality assurance through imaging and on the design of compact accelerators and gantries for ions heavier than protons. Information technologies will take centre stage, because data sharing, data analytics, and decision support systems will be key topics.

Training will be a major focus in the coming years, as the growing number of facilities requires more and more trained personnel: the aim will be to train professionals who are highly skilled in their speciality but at the same time familiar with the multidisciplinary aspects of hadrontherapy.

Over the years, the ENLIGHT community has shown a remarkable ability to reinvent itself, while maintaining its cornerstones of multidisciplinarity, integration, openness, and attention to future generations. The new list of priorities will allow the network to tackle the latest challenges of a frontier discipline such as hadrontherapy in the most effective way.

XENON opens a new era for dark-matter searches

In recent years, evidence for the existence of dark matter from astrophysical observations has become indisputable. Although the nature of dark matter remains unknown, many theoretically motivated candidates have been proposed. Among them, the most popular are Weakly Interacting Massive Particles (WIMPs), with predicted masses in the range from a few GeV/c² to the TeV/c² scale and with interaction strengths roughly on the weak scale.

WIMPs are being searched for using three complementary techniques: indirectly, by detecting the secondary products of WIMP annihilation or decay in celestial bodies; by producing WIMPs at colliders, foremost the LHC; and by direct detection, by measuring the energy of recoiling nuclei produced by collisions with WIMPs in low-background detectors.

On 11 November 2015, the most sensitive detector for the direct detection of WIMPs, XENON1T, was inaugurated at the Italian Laboratori Nazionali del Gran Sasso (LNGS) – the largest underground laboratory in the world. XENON1T, led by Elena Aprile of Columbia University, was built and is operated by a collaboration of 21 research groups from France, Germany, Italy, Israel, the Netherlands, Portugal, Sweden, Switzerland, the United Arab Emirates and the US. In total, about 130 physicists are involved.

XENON1T is the current culmination of the XENON programme of dark-matter direct-detection experiments. The programme started with the 25 kg XENON10 detector about 10 years ago; its second phase, XENON100 (CERN Courier October 2013 p13), with 161 kg, has been tremendously successful: in the summer of 2012, the XENON collaboration published results from a search for spin-independent WIMP–nucleon interactions that provided the most stringent constraints on WIMP dark matter, until superseded by the LUX experiment (CERN Courier December 2013 p8) with a larger target.

XENON100 has since provided a series of other important results, such as constraints on the spin-dependent WIMP–nucleon cross-section, constraints on solar axions and galactic axion-like particles and, more recently, searches for annual rate modulations, which exclude WIMP–electron scattering that could have provided a dark-matter explanation of the signal observed by DAMA/LIBRA (CERN Courier November 2015 p10).

Low background is key

The new XENON1T detector has an estimated sensitivity a factor of 100 better than that of XENON100. This will be reached after about two years of data taking. With only one week of data taking, XENON1T will be able to reach the current LUX limit, opening up a new phase in the search for dark matter in early 2016.

The XENON detectors are dual-phase time-projection chambers (TPCs) filled with liquid xenon (LXe) as the target material. Interactions of particles in the liquid xenon give rise to prompt scintillation light and ionisation. The ionisation electrons are drifted in a strong electric field and extracted into the gas above the liquid, where a secondary scintillation signal is produced. Both scintillation signals are read out by arrays of photomultiplier tubes (PMTs) placed above and below the target volume. The position of the interaction vertex can be reconstructed in 3D using the hit pattern on the upper PMT array and the time delay between the prompt and secondary scintillation signals. The position reconstruction enables self-shielding, by selecting only events that interact within the inner “fiducial” volume of the detector. Because of their small cross-section, WIMPs will interact at most once in the detector, so the background (e.g. from neutrons) can be reduced further by selecting single-scatter interactions. Beta and gamma backgrounds are reduced by selecting events with a ratio of secondary-to-prompt signal that is typical for nuclear recoils.
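
A minimal sketch of the reconstruction logic described above, assuming a typical electron drift velocity in liquid xenon (the experiments use detailed likelihood fits rather than the simple centroid shown here):

```python
import numpy as np

# Dual-phase TPC event reconstruction in miniature: depth from the delay
# between prompt (S1) and secondary (S2) scintillation, x-y from the light
# pattern on the top PMT array. All numbers are illustrative assumptions.

V_DRIFT_MM_US = 1.9  # assumed electron drift velocity in LXe, mm/us

def depth_mm(dt_us):
    """Interaction depth below the liquid surface from the S1-S2 delay."""
    return V_DRIFT_MM_US * dt_us

def xy_centroid(pmt_xy_mm, signals):
    """Crude x-y estimate: signal-weighted centroid of the top-array hits."""
    xy, w = np.asarray(pmt_xy_mm, float), np.asarray(signals, float)
    return (xy * w[:, None]).sum(axis=0) / w.sum()

# A delay of 250 us puts the vertex ~475 mm down, deep inside the
# self-shielded fiducial volume of a 1-m-tall TPC.
print(f"z ~ {depth_mm(250):.0f} mm below the liquid surface")
print("x, y ~", xy_centroid([(0, 0), (76, 0), (0, 76)], [120, 60, 60]), "mm")
```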

The XENON1T detector is filled with about 3.5 tonnes of liquid xenon in total. Its cylindrical TPC – 1 m high and 1 m in diameter, laterally defined by highly reflective Teflon – is the largest liquid-xenon TPC ever built. Specially designed copper field-shaping electrodes ensure the uniformity of the drift field for the desired field strength of 1 kV/cm. The TPC’s active volume contains 2 tonnes of LXe viewed by two arrays of 3-inch PMTs – 121 at the bottom, immersed in LXe, and 127 on the top, in the gaseous phase. The xenon gas is liquefied and kept at a temperature of about –95 °C by a system of pulse-tube refrigerators. The xenon is stored, and can be recovered in liquid phase, in a custom-designed stainless-steel sphere that can hold up to 7.6 tonnes of xenon in high-purity conditions. Figure 3 shows the XENON1T detector and service building situated in Hall B at LNGS. Figure 1 shows XENON collaborators assembling the TPC in a clean room above ground at LNGS.

The expected WIMP–nucleon interaction rate is less than 10 events per tonne of xenon per year. Background rejection is therefore the key to success for direct-detection experiments. Externally induced backgrounds can be minimised by exploiting the self-shielding capabilities. In addition, the detector is surrounded by a cylindrical water vessel, 10 m high and 9.6 m in diameter, equipped with PMTs to tag, with an efficiency of 99.9%, muons that could induce neutrons.

For a detector the size of XENON1T, radioactive impurities in the detector materials and the xenon itself become the biggest challenge for background reduction. Extensive radiation-screening campaigns, using some of the world’s most sensitive germanium detectors, have been conducted, and high-purity PMTs have been specially developed by Hamamatsu in co-operation with the collaboration. Contamination of the xenon by radioactive radon (mainly ²²²Rn) and krypton (⁸⁵Kr), which dominates the target-intrinsic background, led to the development of cryogenic-distillation techniques to suppress the abundance of these isotopes to unprecedentedly low levels.

The best scenario

After about two years of data taking, XENON1T will be able to probe spin-independent WIMP–nucleon cross-sections down to 1.6 × 10⁻⁴⁷ cm² (at a WIMP mass of 50 GeV/c²; see figure 2). In popular scenarios involving supersymmetry, XENON1T will either discover WIMPs or exclude most of the theoretically relevant parameter space. Following the inauguration, the first physics run is envisaged to start early this year.

Most of the infrastructure, for example the outer cryostat, the Cherenkov muon veto, the xenon cryogenics, the purification and storage systems and the data-acquisition system, has been dimensioned for a larger experiment, named XENONnT, which is designed to contain more than 7 tonnes of LXe. A new TPC, about 40% larger in diameter and height and equipped with about 400 PMTs, will replace the XENON1T TPC. The goal for XENONnT is to achieve another order of magnitude improvement in sensitivity within a few years of data taking. XENONnT is scheduled to start data taking in 2018.

• For further details, see www.xenon1t.org/.

LHC surpasses design luminosity with heavy ions

Résumé

The LHC exceeds design luminosity with heavy ions

The LHC ended 2015 with a heavy-ion run. For the first time, the centre-of-mass energy reached in collisions of lead nuclei attained 5.02 TeV per nucleon pair. This energy, unprecedented for lead–lead collisions, is almost twice that of the previous lead–lead run, in 2011, and some 25 times that reached at RHIC, at Brookhaven; the feat extends the study of the quark–gluon plasma to still higher densities and temperatures.

The extensive modifications made to the LHC during its first long shutdown allowed the energy of the proton beams to be increased from 4 TeV in 2012 to 6.5 TeV, enabling proton–proton collisions at a centre-of-mass energy of 13 TeV in 2015. As usual, a one-month heavy-ion run was scheduled at the end of the year. With lead nuclei colliding, the same fields in the LHC’s bending magnets would have allowed 5.13 TeV per colliding nucleon pair. However, it was decided to forego the last whisker of this increase to match the equivalent energy of the proton–lead collisions that took place in 2013, namely 5.02 TeV. Furthermore, the first week of the run was devoted to colliding protons at 2.51 TeV per beam. This will allow the LHC experiments to make precise comparisons of three different combinations of colliding particles – p–p, p–Pb and Pb–Pb – at the same effective energy of 5.02 TeV. This is crucial to disentangling the increasing complexity of the observed phenomena (CERN Courier March 2014 p17).
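
The 5.13 TeV figure follows directly from magnetic rigidity: at a given bending field the machine fixes the momentum per unit charge, so a lead nucleus carries Z times the proton momentum, shared among its A nucleons. A quick check:

```python
# Centre-of-mass energy per nucleon pair for Pb beams at the proton-equivalent
# field: each ion carries Z times the proton momentum, shared among A nucleons.

E_PROTON_TEV = 6.5   # 2015 proton beam energy
Z, A = 82, 208       # lead

e_per_nucleon = E_PROTON_TEV * Z / A   # TeV per beam nucleon
print(f"sqrt(s_NN) at full field: {2 * e_per_nucleon:.3f} TeV")  # 5.125 TeV
# The run used 5.02 TeV instead, matching the 2013 proton-lead energy.
```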

The first (and last, until 2018) Pb–Pb operation close to the full energy of the LHC was also the opportunity to finally assess some of its ultimate performance limits as a heavy-ion collider. A carefully targeted set of accelerator-physics studies also had to be scheduled within the tight time frame.

Delivering luminosity

The chain of specialised heavy-ion injectors – comprising the electron cyclotron resonance ion source, Linac3 and the LEIR ring, with its elaborate bunch-forming and cooling – was recommissioned to provide intense and dense lead bunches in the weeks preceding the run. Through a series of elaborate RF gymnastics, the PS and SPS assemble these into 24-bunch trains for injection into the LHC. The beam intensity delivered by the injectors is a crucial determinant of the luminosity of the collider.

Planning for the recommissioning of the LHC to run in two different operational conditions after the November technical stop resembled a temporal jigsaw puzzle, with alternating phases of proton and heavy-ion set-up (the latter using proton beams at first) continually readapted to the manifold constraints imposed by other activities in the injector complex, the strictures of machine protection, and the unexpected. For Pb–Pb operation, a new heavy-ion magnetic cycle was implemented in the LHC, including a squeeze to β* = 0.8 m, together with manipulations of the crossing angle and interaction-point position at the ALICE experiment. First test collisions occurred early in the morning of 17 November, some 10 hours after first injection of lead.

The new Pb–Pb energy was almost twice that of the previous Pb–Pb run in 2011, and some 25 times that of RHIC at Brookhaven, extending the study of the quark–gluon plasma to still-higher energy density and temperature. Although the energy per colliding nucleon pair characterises the physical processes, it is worth noting that the total energy packed into a volume on the few-fm scale exceeded 1 PeV for the first time in the laboratory.
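
That headline number is simple arithmetic: the per-nucleon-pair energy summed over all 208 nucleon pairs of the two colliding ions.

```python
# Total energy in a single head-on Pb-Pb collision at 5.02 TeV per nucleon pair
sqrt_s_nn_tev = 5.02
print(f"Total collision energy: {208 * sqrt_s_nn_tev / 1000:.2f} PeV")  # ~1.04 PeV
```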

After the successful collection of the required number of p–p reference collisions, the Pb–Pb configuration was validated through an extensive series of aperture measurements and collimation-loss maps. Only then could “stable beams” for physics be declared at 10.59 a.m. on 25 November, and spectacular event displays started to flow from the experiments.

In the next few days, the number of colliding bunches in each beam was stepped up to the anticipated value of 426, and the intensity delivered by the injectors was boosted to its highest-ever values. The LHC passed a historic milestone by exceeding a luminosity of 10²⁷ cm⁻² s⁻¹, the value advertised in its official design report in 2004.

This allowed the ALICE experiment to run in its long-awaited saturated mode with the luminosity levelled at this value for the first few hours of each fill.

Soon afterwards, an unexpected bonus came from the SPS injection team, who pulled off the feat of shortening the rise time of the SPS injection kicker array, first to 175 ns then to 150 ns, allowing 474, then 518, bunches to be stored in the LHC. The ATLAS and CMS experiments were able to benefit from luminosities over three times the design value. A small fraction of the luminosity in this run was delivered to the LHCb experiment, a newcomer to Pb–Pb collisions.

Nuclear beam physics

The electromagnetic fields surrounding highly charged ultrarelativistic nuclei are strongly Lorentz-contracted into a flat “pancake”. According to the original insight of Fermi, Weizsäcker and Williams, these fields can be represented as a flash of quasi-real photons. At LHC energies, their spectrum extends up to hundreds of GeV. In a very real sense, the LHC is a photon–photon and photon–nucleus collider (CERN Courier November 2012 p9). The study of such ultraperipheral (or “near-miss”) interactions, in which the two nuclei do not overlap, is an important subfield of the LHC experimental programme, alongside its main focus on the study of truly nuclear collisions.

From the point of view of accelerator physics, the ultraperipheral interactions with their much higher cross-sections loom still larger in importance. They dominate the luminosity “burn-off”, or rate at which particles are removed from colliding beams, leading to short beam and luminosity lifetimes. Furthermore, they do so in a way that is qualitatively different from the spray of a few watts of “luminosity debris” by hadronic interactions. Rather, the removed nuclei are slightly modified in charge and/or mass, and emerge as new, well-focussed, secondary beams. These travel along the interaction region just like the main beam but, as soon as they encounter the bending magnets of the dispersion-suppressor section, their trajectories deviate, as in a spectrometer.

The largest contribution to the burn-off cross-section comes from the so-called bound-free pair-production (BFPP) in which the colliding photons create electron–positron pairs with the electron in a bound-state of one nucleus. A beam of these one-electron ions, carrying a power of some tens of watts, emerges from the interaction point and is eventually lost on the outer side of the beam pipe.
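
The quoted beam power is easy to reproduce. As a rough estimate, assuming a BFPP cross-section of about 280 b (a literature-scale value for Pb–Pb at these energies, not a figure from this article) at the design luminosity:

```python
# Rough power estimate for the BFPP secondary beam. The cross-section is an
# assumed, literature-scale value; the luminosity is the LHC design figure.

SIGMA_BFPP_CM2 = 280e-24                 # ~280 barn, assumption
LUMI = 1e27                              # cm^-2 s^-1, design luminosity
E_ION_J = 208 * 2.51e12 * 1.602e-19      # 208 nucleons x 2.51 TeV, in joules

rate = LUMI * SIGMA_BFPP_CM2             # one-electron ions produced per second
print(f"BFPP rate:  {rate:.1e} ions/s")
print(f"Beam power: {rate * E_ION_J:.0f} W, deposited in a localised spot")
```

At three times the design luminosity, as reached in this run, the same estimate scales to roughly 70 W.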

Controlled quench

The LHC operators have become used to holding their breath as the BFPP loss peaks on the beam-loss monitors rise towards the threshold for dumping the beams (figure). There has long been a concern that the energy deposited into superconducting magnet coils may cause them to quench, bringing the run to an immediate halt and imposing a limit on luminosity. In line with recent re-evaluations of the magnet-quench limits, this did not happen during physics operation in 2015 but may happen in future operation at still-higher luminosity. During this run, mitigation strategies to move the losses out of the magnets were successfully implemented. Later, in a special experiment, one of these bumps was removed and the luminosity slowly increased. This led to the first controlled steady-state quench of an LHC dipole magnet with beam, providing long-sought data on their propensity to quench. On the last night of the run, another magnet quench was deliberately induced by exciting the beam to create losses on the primary collimators.

Photonuclear interactions also occur at comparable rates in the collisions and in the interactions with the graphite of the LHC collimator jaws. Nuclei of ²⁰⁷Pb, created by the electromagnetic dissociation of a neutron from the original ²⁰⁸Pb at the primary collimators, were identified as a source of background after traversing more than a quarter of the ring to the tertiary collimators near ALICE.

These, and other phenomena peculiar to heavy-ion operation, must be tackled in the quest for still-higher performance in future years.

ATLAS and CMS upgrade proceeds to the next stage

Résumé

The ATLAS and CMS upgrade programme proceeds to the next stage

The High-Luminosity LHC project will also extend the experiments’ physics programme. Key components of the detectors will have to be replaced for these instruments to cope with the pile-up of proton–proton interactions – 140 to 200, on average, per bunch crossing. In October 2015, the CERN Resource Review Board confirmed that the collaborations may now prepare Technical Design Reports (TDRs). Passing this first step of the approval process is a great success for the ATLAS and CMS experiments.

At the end of the third operational period, in 2023, the LHC will have delivered 300 fb⁻¹, and the final-focussing magnets installed at the collision points in the four interaction regions will need to be replaced. By redesigning these magnets and improving the beam optics, the luminosity can be greatly increased. The High Luminosity LHC (HL-LHC) project aims to deliver 10 times the original design integrated luminosity (number of collisions) of the LHC (CERN Courier December 2015 p7). This will extend the physics programme and open a new window of discovery. But key components of the experiments will also have to be replaced to cope with the pile-up of 140–200 proton–proton interactions occurring, on average, per bunch crossing of the beams. In October 2015, the ATLAS and CMS collaborations met a major milestone in preparing these so-called Phase II detector upgrades for operation at the HL-LHC, when it was agreed at the CERN Resource Review Board that they should proceed to prepare Technical Design Reports.

New physics at the HL-LHC

The headline result of the first operation period of the LHC was the observation of a new boson in 2012. With the present data set, this boson is fully consistent with being the Higgs boson of the Standard Model of particle physics. Its couplings (interaction strengths) with other particles in the dominant decay modes are measured with an uncertainty of 15–30% by each experiment, and scale with mass as predicted (see figure 1). With the full 3000 fb⁻¹ of the HL-LHC, the dominant couplings can be measured with a precision of 2–5%; this potential improvement is also shown in figure 1. What’s more, rare production processes and decay modes can be observed. Of particular interest is to find evidence for the production of a pair of Higgs bosons, which depends on the strength of the interaction between the Higgs bosons themselves. This will be complemented by precise measurements of other Standard Model processes, and any deviations from the theoretical predictions will be indirect evidence for a new type of physics.
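
The jump from 15–30% to 2–5% is roughly what statistics alone would suggest, since a counting measurement improves with the square root of the integrated luminosity. A crude sketch, assuming a 25 fb⁻¹ Run-1 reference dataset and ignoring systematic and theory uncertainties:

```python
import math

# Naive statistical scaling of coupling precision with integrated luminosity.
# The 25 fb^-1 Run-1 reference is an assumption, and systematics are ignored.

LUMI_RUN1 = 25.0    # fb^-1, approximate Run-1 dataset per experiment
LUMI_HL = 3000.0    # fb^-1, full HL-LHC dataset

for precision_now in (0.15, 0.30):  # current 15-30% coupling uncertainties
    projected = precision_now * math.sqrt(LUMI_RUN1 / LUMI_HL)
    print(f"{precision_now:.0%} today -> ~{projected:.1%} at {LUMI_HL:.0f} fb^-1")
# ~1.4-2.7%: the same ballpark as the quoted 2-5%, which also accounts
# for systematic and theoretical uncertainties.
```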

In parallel, the direct search for physics beyond the Standard Model will continue. The theory of supersymmetry (SUSY) introduces a heavy partner for each ordinary particle. This is very attractive in that it solves the problem of how the Higgs boson can remain relatively light, with a mass of 125 GeV, despite its interactions with heavy particles; in particular, SUSY can cancel the large corrections to the Higgs mass from the 173 GeV top quark. According to SUSY, the contributions from ordinary particles are cancelled by the contributions from the supersymmetric partners. The presence of the lightest SUSY particle can also explain the dark matter in the universe. Figure 2 compares the results achievable at the LHC and HL-LHC in a search for electroweak SUSY particles. Electroweak SUSY production has a relatively low rate, and benefits from the factor 10 increase in luminosity. Particles decaying via a W boson and a Z boson give final states with three leptons and missing transverse momentum.

Other “exotic” models will also be accessible, including those that introduce extra dimensions to explain why gravity is so weak compared with the other fundamental forces.

If a signal for new particles or new interactions begins to emerge – and this might happen in the ongoing second period of LHC operation, which is running at higher energy than the first – the experiments will have to be able to measure them precisely at the HL-LHC to distinguish between different theoretical explanations.

Experimental challenges

To achieve the physics goals, ATLAS and CMS must continue to be able to reconstruct all of the final-state particles with high efficiency and low fake rates, and to identify which ones come from the collision of interest and which come from the 140–200 additional events in the same bunch crossing. Along with this greatly increased event complexity, at the HL-LHC the detectors will suffer from unprecedented instantaneous particle flows and integrated radiation doses.

Detailed simulations of these effects were carried out to identify the sub-systems that will either not survive the high luminosity environment or not function efficiently because of the increased data rates. Entirely new tracking systems to measure charged particles will be required at the centre of the detectors, and the energy-measuring calorimeters will also need partial replacement, in the endcap region for CMS and possibly in the more forward region for ATLAS.

The possibility of efficiently selecting good events and the ability to record higher rates of data demand new triggering and data-acquisition capabilities. The main innovation will be to implement tracking information at the hardware level of the trigger decision, to provide sufficient rejection of the background signals.

The new tracking devices will use silicon-sensor technology, with strips at the outer radii and pixels closer to the interaction point. The crucial role of the tracker systems in matching signals to the different collisions is illustrated in figure 3, where the event display shows the reconstruction of an interaction producing a pair of top quarks among 200 other collisions. The granularity will be increased by about a factor of five, to keep occupancies at a level similar to that of the current detectors under present operating conditions. With reduced pixel sizes and strip pitches, the detector resolution will be improved. Thinner sensors and deeper-submicron technologies for the front-end read-out chips will be used to sustain the high radiation doses. And to further improve the measurements, the quantity and mass of the materials will be substantially reduced by employing lighter mechanical structures and materials, as well as new techniques for the cooling and powering schemes.

The forward regions of the experiments suffer most from the high pile-up of collisions, and the tracker coverage will therefore be extended to better match the calorimetry measurements. Associating energy deposits in the calorimeters with the charged tracks over the full coverage will substantially improve jet identification and missing-transverse-energy measurements. The event display in figure 4 shows the example of a Higgs boson produced by the vector-boson-fusion (VBF) process and decaying to a pair of τ leptons.

The calorimeters in ATLAS and CMS use different technologies and require different upgrades. ATLAS is considering replacing the liquid-argon forward calorimeter with a similar detector, but with higher granularity. For further mitigation of pile-up effects, a high-granularity timing detector with a precision of a few tens of picoseconds may be added in front of the endcap LAr calorimeters. In CMS, a new high-granularity endcap calorimeter will be implemented. The detector will comprise 40 layers of silicon-pad sensors interleaved with W/Cu and brass or steel absorber to form the electromagnetic and hadronic sections, respectively. The hadronic-energy measurement will be completed with a scintillating tile section similar to the current detector. This high-granularity design introduces shower-pointing ability and high timing precision. Additionally, CMS is investigating the potential benefits of a system that is able to measure precisely the arrival time of minimum ionising particles to further improve the vertex identification for all physics objects.

The muon detectors in ATLAS and CMS are expected to survive the full HL-LHC period; however, new chambers and read-out electronics will be added to improve the trigger capabilities and to increase the robustness of the existing systems. ATLAS will add new resistive-plate chambers (RPCs) and small monitored drift-tube chambers to the innermost layer of the barrel. The endcap trigger chambers will be replaced with small-strip thin-gap chambers. CMS will complete the coverage of the current RPCs in the endcaps with high-rate-capability chambers: gas electron multipliers (GEMs) in the front stations and RPCs in the last ones. Both experiments will install a muon tagger to benefit from the extended tracker coverage.

The trigger systems will require increased latency to allow sufficient time for the hardware track reconstruction and will also have larger throughput capability. This will require the replacement of front-end and back-end electronics for several of the calorimeter and/or muon systems that will otherwise not be replaced. Additionally, these upgrades will allow the full granularity of the detector information to be exploited at the first stage of the event selection.

Towards Technical Design Reports

To reach the first milestone in the approval process agreed with the CERN scientific committees, ATLAS and CMS prepared detailed documentation describing the entire Phase II “reference” upgrade scope and the preliminary planning and cost evaluations. This documentation includes the scientific motivations for the upgrades, demonstrated through studies of the performance reach for several physics benchmarks, examined under pile-up conditions of 140 and 200 collisions. The performance degradation in two reduced-cost scenarios, in which the upgrades are descoped or downgraded, was also investigated. After reviewing this material, the CERN LHC Committee and the Upgrade Cost Group reported to the CERN Research Board and the Resource Review Board, concluding that: “For both experiments, the reference scenario provides well-performing detectors capable of addressing the physics at the HL-LHC.”

The success of this first step of the approval process was declared, and the ATLAS and CMS collaborations are now eager to proceed with the necessary R&D and detector designs to prepare Technical Design Reports over the next two years.

• For further details, visit https://cds.cern.ch/record/2055248, https://cds.cern.ch/record/2020886 and https://cds.cern.ch/record/2055167/files/LHCC-G-165.pdf.

CAST: enlightening the dark


Our star has been a target of human investigation since the beginning of science, yet a plethora of observations remain unexplained. A good example is the unnaturally hot solar corona, whose temperature spans 1–10 MK. This anomaly has been studied since 1939 but, in spite of a tremendous number of observations, no real progress has been made in understanding its origin. We also know that a significant fraction of the Sun’s total luminosity, about 4%, can escape as some form of radiation that we do not yet know of, without being in conflict with the constraints imposed by the evolution of the Sun. In this framework, physicists have hypothesised the existence of exotic particles, including axions and chameleons. Other particles, such as the celebrated WIMPs, also point to the Sun as a target for relevant investigations. Indeed, over cosmic time periods, WIMPs can be gravitationally trapped inside the solar core. There, they accumulate, allowing their mutual annihilation into known particles, including escaping high-energy neutrinos.

A breakthrough discovery in the so-called “dark sector” could pop up at any time. The question is when this will happen and where: in an Earth-bound laboratory or in a space-bound one. It is worth stressing that it is not at all obvious whether the extreme conditions in the Sun can be completely duplicated on Earth.

Benchmark for axion searches

For many days in recent years, CAST – the CERN Axion Solar Telescope (CERN Courier April 2010 p22) – has pointed its antenna towards the Sun for about 100 minutes each day, during sunrise and sunset. Its aim was to detect solar axions through the Primakoff effect (1950), a classic detection scheme from particle physics. This solar-axion search was completed in November 2015 (CERN Bulletin, https://cds.cern.ch/journal/CERNBulletin/2015/39/News%20Articles/2053133?ln=en) and, even though CAST has not observed an axion signature, it provides world-best limits on the strength of the axion interaction that would allow solar axions to convert into photons in the magnetic field present inside the CAST magnet bores.

The results of the CAST scientific programme were also achieved thanks to the X-ray telescope (XRT), recovered from the German ABRIXAS space mission and installed downstream of one of the magnet bores. The telescope works as a lens, focusing the photon flux onto the detector; any excess of photons in the focal spot over the background would be a signature of axions. This unique technique, borrowed from astrophysics, allows the collaboration to measure signal and background simultaneously. Given its success, a second X-ray telescope was added in 2014.

Very accurate tracking of the Sun is crucial to the experiment’s data analysis. To provide this, CERN surveyors pinpoint exactly where the telescope lies and where it is pointing, relative to a reference in time and space. However, to be absolutely certain, twice a year, when the Sun is visible through a window in the CAST experimental hall, the magnet tracks the Sun with a camera mounted and aligned to point exactly along its axis. This process of “Sun filming” has confirmed that CAST points at the centre of the Sun with sufficient precision.

Up to now, CAST has been looking for exotica that the Sun might have produced some 10 minutes earlier. However, thanks to a continuous upgrade programme for the detectors and the development of new ideas, the collaboration is now extending its horizons, back in time closer to the Big Bang and deeper into the dark sector. At its 119th meeting, the CERN SPS and PS Experiments Committee (SPSC) recommended the new CAST physics programme, which includes searches for relic axions and chameleons, for approval.

Axions from the Big Bang

Due to their extremely long lifetime (longer than the age of the universe), axions produced during the Big Bang could still be detected today. These relic particles have been searched for with instruments using a resonant cavity immersed in a strong magnetic field where axions are expected to convert into photons (with a probability that depends quadratically on the magnetic-field intensity). The signal is further enhanced when the cavity is at resonance with the photon frequency. In particular, the signal strength depends on the cavity “quality factor”, defined as the ratio between the cavity fundamental frequency and the resonance line width.

However, the inherent problem of axion searches is the unknown rest mass, although the cosmologically preferred mass range for so-called cold dark-matter axions lies between μeV/c² and meV/c², with a favoured region around 0.1 meV/c². The photon energy is equal to the axion rest mass, because the axion’s kinetic energy is negligibly small. To scan the regions of interest, the cavity resonant frequency is varied over a certain axion-mass range, determined essentially by the cavity size and shape.
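
In practice the scan is expressed in frequency: an axion of rest mass m_a converts to a photon of frequency ν = m_a c²/h, and a cavity of quality factor Q responds only within a band of width about ν/Q, which sets the tuning step. A sketch with an assumed quality factor:

```python
# Axion mass to photon frequency, nu = m_a * c^2 / h, and the corresponding
# cavity linewidth nu/Q. The quality factor below is an assumed round value.

H_EV_S = 4.1357e-15  # Planck constant, eV*s

def mass_to_freq_ghz(m_a_ev):
    """Photon frequency (GHz) for an axion of rest mass m_a (in eV/c^2)."""
    return m_a_ev / H_EV_S / 1e9

Q = 1e5  # assumed loaded quality factor
for m_ueV in (1, 10, 100):  # spans the mass range discussed in the text
    f = mass_to_freq_ghz(m_ueV * 1e-6)
    print(f"m_a = {m_ueV:3d} ueV/c^2 -> {f:7.3f} GHz, linewidth ~ {f / Q * 1e6:.1f} kHz")
```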

Dipole magnets, such as the CAST magnet, can be transformed into relic-axion antennas by means of new resonant microwave cavities. These cavities, designed and built by the Korean Centre for Axion and Precision Physics (CAPP) in collaboration with CERN, will be inserted inside the dipole magnetic field, within the 1.7 K cold bores, to search for microwave photons converted from cosmological axions – direct messengers from the Big Bang era. In addition, a second microwave sensor will be inserted in the other bore. With its new set-up currently under construction, CAST should have access to an axion-mass range up to 100 μeV/c². At these relatively high mass values, detection becomes much harder, but the hope is that this region, which is critical for the dark-matter conundrum, will also be explored.

Chameleons come on stage

As may be imagined, detecting chameleons – new scalar particles that are possible candidates for the unknown dark energy – is not a trivial matter. The CAST collaboration plans to do it by exploiting two different couplings: Primakoff coupling to photons and direct coupling to matter.

The expected energy spectrum of solar chameleons peaks at about 600 eV, making them even harder to detect through their Primakoff coupling than axions. Sub-keV-threshold, low-background photon detectors are therefore required. To tackle this problem, the CAST collaboration decided to start with a Silicon Drift Detector (SDD), becoming, with recently published results, the first chameleon helioscope. The new InGRID detector, based on the MicroMegas concept and featuring on-board read-out electronics, replaced the CCD camera in the XRT focal plane in 2014, improving the overall expected performance of CAST for solar chameleons.

Chameleon particles are theorised to have amazing properties: they can freely traverse thick slabs of dense matter if they impinge on them normally (i.e. perpendicular to the surface), or they can bounce off nanometre-thin membranes, not much denser than ordinary glass, when approaching them at a grazing-incidence angle of just a few degrees. In doing so, they exert a minute force, much like grains of sand hitting the palm of a hand. If detected, this tiny force would be the signature of the direct interaction of chameleons with matter.

Forces are experienced in everyday life, so there may seem to be nothing special about detecting them. However, sensing exceedingly tiny forces requires advanced skills and techniques. The KWISP opto-mechanical force sensor can instantaneously feel forces of 10⁻¹⁴ N – that is, the weight of a single bacterium. It uses a Si₃N₄ membrane, just 100 nm thick, to intercept the flux of solar chameleons. Being as elastic as a drumhead, it flexes under their collective force (pressure) by an amount smaller than the size of an atomic nucleus. The membrane sits inside a Fabry–Pérot optical resonator, made of two high-reflectivity super-mirrors facing each other, in which a standing wave from an IR laser beam is trapped. As the membrane flexes, the characteristic frequency of this wave changes, generating the signal. The power of the KWISP sensor comes from the combined response of two high-Q resonators: the optical (Fabry–Pérot) and the mechanical (membrane).
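
The bacterium comparison survives a quick sanity check: the mass whose weight is 10⁻¹⁴ N is about a picogram, indeed the typical mass of a single bacterium.

```python
# Mass equivalent of the KWISP force sensitivity under Earth's gravity
F_MIN = 1e-14   # N, quoted sensitivity
G = 9.81        # m/s^2
print(f"Equivalent mass: {F_MIN / G * 1e15:.1f} pg")  # ~1 pg, about one bacterium
```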

In addition to KWISP, a further ingredient is necessary in the search for chameleons: a time-dependent amplitude modulation of the chameleon flux, such that the drum is beaten at its eigenfrequency for maximum effect. To solve this problem, the authors have invented the chameleon chopper – basically a rotating, optically flat surface applying the principle of chameleon optics: transmission at normal incidence, reflection at grazing incidence. Surprisingly, phase-locking techniques can also exploit this angular variation to obtain additional information on chameleon physics.

According to theory, the internal surfaces of the ABRIXAS telescope, designed to reflect X-rays impinging at grazing incidence, would also reflect and focus chameleons. This increases their flux by a factor larger than 100, which is further enhanced by the exposure time gained from Sun-tracking. This unplanned ability of the X-ray telescope is one of those lucky events by which nature sometimes smiles at scientists, allowing them to explore its secrets.

The KWISP prototype is currently taking data at INFN Trieste (Italy), and a clone is being commissioned at CERN to take advantage of the CAST infrastructure. As also noted by the SPSC referees, the KWISP force sensor should make it possible to address more fundamental physics questions, such as quantum gravity or the validity of Newton’s 1/R² law at short distances. We plan, with colleagues from the Technical University of Darmstadt (Germany), Freiburg University (Germany) and CAPP (Korea), to develop an advanced KWISP design, aKWISP, and we welcome the interest of additional collaborators.

While it remains one of the lowest-cost astroparticle physics experiments, CAST is preparing to leap further into the dark sector. As history teaches us (see table 1), the Sun may be the key to this, although as our understanding of the Sun deepens, we will most probably uncover more mysteries about the star that gives us life.

• For more information, see https://cds.cern.ch/record/2022893.

Imaging science: physics laboratories under the spotlight

“Lighting the way for dark-matter detection and future particle-physics research.” An electric mining drill deep within the Stawell Gold Mine (SUPL). Awarded 1st people’s choice.

“The Incredibles.” This photograph was taken in the CERN restaurant, one of the key meeting points for CERN scientists. The jury noted the humanity behind the image: “There is a need for transferring ideas to make and create the world we live in.” Awarded 3rd jury’s choice.

From TRIUMF’s main control room, operators control the laboratory’s main cyclotron and proton beamlines. The jury noted how the image combines the technical complexity of the science with the emotional component of the human operator. Awarded 1st jury’s choice.

A vacuum chamber containing a mirror carrying the FLAME laser beam to the experimental room of the SPARC accelerator at the INFN National Laboratory of Frascati. Awarded 2nd people’s choice.

Detail of the forward radial wire chamber forming part of the H1 detector that took data at the HERA collider at DESY from 1992 to 2007. Awarded 3rd people’s choice.

Taken in the temporary laboratory set up in the Stawell Gold Mine at SUPL, the image “gives a sense of the work that goes into particle physics long before there are data to analyse”. Awarded 2nd jury’s choice.

PS beam extraction becomes more efficient

Résumé

PS beam extraction becomes more efficient

Since September 2015, the proton beams used for fixed-target physics experiments at CERN’s Super Proton Synchrotron have been produced by a new extraction system from the Proton Synchrotron. This more efficient system is called Multi-Turn Extraction (MTE). With the tests broadly successful, MTE will be the standard extraction mode for fixed-target operation at the SPS in 2016. By the middle of next year, a final decision will be taken on the long-term future of MTE.

First conceived in 2002, with the goal of significantly reducing the beam losses that cause high ring activation, the Multi-Turn Extraction (MTE) system (CERN Courier March 2009 p29) has since encountered numerous challenges during its implementation in the Proton Synchrotron (PS). Now, regular MTE operation over the last two months constitutes a crucial milestone for this new beam manipulation, and paves the way for further studies and optimisations.

The lows

In 2010, MTE was the default choice to deliver beam for the Super Proton Synchrotron (SPS) physics run. However, after only a few weeks of operation for the production of neutrinos in the framework of the CNGS programme, the PS extraction mode had to revert to Continuous-Transfer (CT) extraction, which is associated with high beam losses around the ring circumference – the very feature that had motivated the study of an alternative extraction mode. In the new scheme, the beam is split horizontally into five beamlets – one in the centre and four in stable islands of the horizontal phase space. Unfortunately, the intensity sharing between the islands and the centre (figure 1) and the extraction trajectories fluctuated on a cycle-to-cycle basis. This not only significantly affected the beam transmission through acceleration in the SPS, but also prevented proper optimisation of the SPS parameters. In addition, activation of the PS extraction region had increased anomalously, although the rest of the ring was profiting from a significant reduction in radiation levels with respect to CT operation.

Intense investigations were undertaken to find the source of the observed variations and to overcome the PS ring activation. The latter problem proved to be much easier to solve than the former.

The increased activation was tackled by designing a new piece of hardware, a so-called dummy septum – a passive septum with only a copper blade and no coils for generating a guiding magnetic field. The dummy septum intercepts protons from the de-bunched beam during the extraction kickers’ rise time, thereby preventing them from interacting with the blade of the active extraction septum. This approach, combined with appropriate shielding of the dummy septum, provides a well-localised loss point that is acceptable in terms of overall activation of the PS ring. The use of a shadowing device for the main extraction septum is a known approach – in the SPS, for example. However, in a small ring like the PS, the two devices cannot easily be located next to each other, which makes the overall configuration more complicated in terms of beam dynamics.

A side effect of the dummy-septum solution is that the horizontal aperture of the PS ring is reduced, which called for a complete review of all PS fast-extraction schemes to make them compatible with the presence of the new device. This additional hurdle was overcome, and the proposed solution looked acceptable on paper.

During Long Shutdown 1 (February 2013–May 2014) the design, construction, and installation of the dummy septum was completed, together with some modifications to the powering of the extraction bump. The beam commissioning of the whole system, including the new extractions, was completed successfully by the end of 2014.

To tackle the fluctuations of the extraction trajectories, systematic observations of hardware parameters in the PS ring, such as the currents of the key magnetic elements controlling the machine configuration, were undertaken. The aim was to find a correlation with the changes in intensity sharing between beamlets. Despite long and detailed observations, no evidence pointing to the guilty element was found, and the sources of the fluctuations remained unidentified.

The highs

Activities to track down the origin of the fluctuations in the intensity sharing and extraction trajectories of the MTE beam resumed at the beginning of the 2015 physics run. Eventually, it was possible to identify a correlation between these variations and the amplitude of a 5 kHz ripple present on the current of some special circuits located in the PS main magnets (figure 1). The PS ring is made up of 100 combined-function main magnets, with additional coils installed on the magnets’ poles. These so-called pole-face windings (PFWs) and figure-of-eight loops allow control of the working point in terms of tunes and linear chromaticities, and of some higher-order parameters such as the second-order chromaticity. The ensemble of electrical circuits present in the main magnets, and the interaction between the various components, is extremely complex; all switching-mode power converters feature a ripple at 5 kHz with a varying phase, and this turned out to be the culprit behind the observed fluctuations.

This crucial observation guided the power-converter experts in implementing mitigation measures (additional inductances to filter the ripple amplitude), whose expected effect was immediately observed as an improvement in the stability of the MTE beam parameters (figure 2).

This milestone again opened up the route to transfer to the SPS, and first tests of beam delivery were conducted in the summer of 2015. Their success accelerated the subsequent steps, which culminated in the decision to use MTE for fixed-target physics: after the low of 2010, MTE was back in business.

Of course, this was just the start rather than the end of the efforts. The studies, conducted on both the PS and SPS rings, continued with a view to improving the overall beam quality. The intensity was raised from an initial value of about 1.6 × 10¹³ protons extracted from the PS to about 2.0 × 10¹³ protons at the end of the run, with an extraction efficiency in the PS of around 98% (figure 3).
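
To put the quoted efficiency in perspective, a short piece of arithmetic on the figures given in the text shows what a 2% inefficiency means in absolute numbers of protons left in the ring per extraction, which is what drives activation:

```python
# Simple arithmetic on the figures quoted in the text:
# what a 98% extraction efficiency implies in absolute terms.
extracted = 2.0e13    # protons extracted per PS cycle (from the text)
efficiency = 0.98     # quoted extraction efficiency

circulating = extracted / efficiency
lost = circulating - extracted
print(f"Protons lost per extraction: {lost:.1e}")  # roughly 4e11 protons
```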

By the end of the run, the PS Booster had joined the challenge, managing to produce beams that are brighter in the vertical plane to improve transmission in the SPS, which suffers from a limited vertical aperture. The final touch was completing the dummy-septum configuration.

Promising future

The progress and future of MTE were discussed in two internal reviews in 2015. Given the overall success of the MTE studies, commissioning and operation during the last part of the physics run, MTE will be the standard extraction mode for the 2016 fixed-target run at the SPS. This is a necessary step to acquire better knowledge of the new beam manipulation, and also to understand the limitations imposed by the existing hardware. In parallel, studies to probe the beam’s behaviour at even higher intensities will be carried out. This is important in view of future projects requiring even more intense beams than those delivered today. By the middle of next year, a firm decision concerning the long-term future of MTE and of its predecessor, CT, will be taken.

• For all MTE-related documents (design report, publications and talks), visit ab-project-mte.web.cern.ch/AB-Project-MTE/Documents.html.

Science: a model for collaboration?

Today, science, technology and innovation are among the most powerful forces driving social change and development. But what role can a fundamental-science laboratory like CERN actually play in designing creative strategies to strengthen public goods in society?

With a view to contributing to the realisation of the 2030 Agenda for Sustainable Development, approved in September by the United Nations member states, the United Nations Office at Geneva (UNOG) hosted a one-day symposium organised in collaboration with CERN and with the support of Switzerland and France, in their capacity as CERN’s host states.

Air, water, biodiversity, education, knowledge, access to the Internet, peace and welfare: these public goods can be preserved only with the involvement of all stakeholders. At the event, policy makers, diplomats, ambassadors, scientists, intellectuals, epistemic associations, representatives of international governmental and non-governmental organisations, and civil-society representatives explored the value of CERN as a model for co-operation.

CERN is a recognised example of peaceful international collaboration based on transparency, openness and inclusion. The invention of the World Wide Web is emblematic of the spirit that drives advances in basic science, which enable open innovation and education, and connect the worldwide community through shared values. A typical scientific community is self-organised and able to share the infrastructure needed by all. The centrality of knowledge, that is, of scientific arguments in global policy; long-term thinking; agile project and risk management, even under harsh conditions of unpredictability; smart governance and social networking; big data; the consideration of alternative scenarios: these are all features and “goods” that belong to the scientific world but that also play a role in other contexts.

Necessary compromises

The UN world has an impressive infrastructure that ensures global governance, including the UN Secretariat, the General Assembly and the Economic and Social Council. These structures permitted a global consultation process that led to the formulation and adoption of the 2030 agenda, with its 17 sustainable-development goals and 169 targets. The complexity of the governance of such an important process is the result of necessary compromises. It is everybody’s duty to make it efficient and capable of addressing the difficult global challenges the world is now facing. CERN can contribute by explaining its functioning model and by providing, when needed, direct input on science, technology and education.

About one week after the symposium at UNOG, CERN and the World Academy of Arts and Sciences (WAAS) hosted another one-day conference to discuss the topic of “Science, technology, innovation and social responsibility”. The event was organised under the auspices of UNOG and saw the participation of the European Physical Society (EPS) and several Geneva-based international organisations, including the International Labour Office (ILO), World Health Organization (WHO), United Nations Institute for Training and Research (UNITAR), World Meteorological Organization (WMO), International Organization for Standardization (ISO) and World Intellectual Property Organization (WIPO). The specific objective was to survey the potential impact of scientific and technological innovation in different fields on the progress of humanity, independently of political boundaries and of limits, whether spiritual or physical.

Lively discussions took place around the social responsibility that comes with the self-governance of the scientific community. This is obviously of particular relevance when dealing with health-related issues: fighting certain types of disease requires strong collaboration between the scientific community, governments and the companies that produce vaccines. Scientific and technological developments also have a huge impact on the labour market; in this respect, science and society are not sufficiently synchronised, and future planning needs to be better co-ordinated. In fields such as meteorology, scientific co-operation is accepted as essential, because without it every country would lose: predictions and warnings are possible only with a global exchange of data.

All of these initiatives show the importance of keeping alive the dialogue between scientists, diplomats, policy makers, business experts and the public at large. Since the very beginning of the scientific venture that gave birth to CERN, people of different cultures, religions and political opinions have been able to speak the common language of science. In this scenario, peace appears as a natural consequence and becomes an attitude. More than 60 years of peaceful and fruitful collaboration are tangible proof that science can indeed serve as a successful model to follow.

A video recording of “The CERN model, United Nations and Global Public Goods: addressing global challenges” is available at webtv.un.org/watch/panel-1-the-cern-model-science-education-and-global-public-good-cern-unog-symposium-2015/4590293913001. A video recording of the “Science, technology, innovation and social responsibility” conference is available at cds.cern.ch/record/2103652.

From the Great Wall to the Great Collider: China and the Quest to Uncover the Inner Workings of the Universe

By S Nadis and S T Yau
International Press of Boston


The volume presents the reasons behind the ambitious project, pursued by a group of distinguished Chinese scientists led by Shing-Tung Yau, professor of mathematics and physics at Harvard University, to build the next great particle collider in China and so continue the quest to identify the fundamental building blocks of nature.

The discovery of the Brout–Englert–Higgs boson put in place the long-sought missing piece of the Standard Model of particle physics. Although this model describes the behaviour of particles with remarkable accuracy, it is incomplete, because it cannot explain a range of phenomena.

Several centuries ago, Chinese emperors erected a majestic fortification: the Great Wall. Today, Chinese researchers are contributing to particle physics with a project of almost comparable magnificence: the construction of a giant accelerator, the Great Collider.

The book explains the scientific issues at stake, discusses the history of particle physics, and tells the story of the birth and development of the Great Collider project.
