ATLAS hunts for new physics with dibosons

Beyond the Standard Model of particle physics (SM), crucial open questions remain, such as the nature of dark matter, the overabundance of matter compared to antimatter in the universe, and the mass scale of the scalar sector (what makes the Higgs boson so light?). Theorists have extended the SM with new symmetries or forces that address these questions, and many such extensions predict new resonances that can decay into a pair of bosons (diboson), for example: VV, Vh, Vγ and γγ, where V stands for a weak boson (W or Z), h for the Higgs boson, and γ is a photon.

The ATLAS collaboration has a broad search programme for diboson resonances, and the most recent results, using 36 fb–1 of proton–proton collision data taken at the LHC at a centre-of-mass energy of 13 TeV in 2015 and 2016, have now been released. Six different final states characterised by different boson decay modes were considered in searches for a VV resonance: 4ℓ, ℓℓνν, ℓℓqq, ℓνqq, ννqq and qqqq, where ℓ, ν and q stand for charged leptons (electrons and muons), neutrinos and quarks, respectively. For the Vh resonance search, the dominant Higgs boson decay into a pair of b-quarks (branching fraction of 58%) was exploited together with four different V decays, leading to ℓℓbb, ℓνbb, ννbb and qqbb final states. A Zγ resonance was sought in final states with two leptons and a photon.

A new resonance would appear as an excess (bump) over the smoothly distributed SM background in the invariant mass distribution reconstructed from the final-state particles. The left figure shows the observed WZ mass distribution in the qqqq channel together with simulations of some example signals. A key to probing very high-mass signals is the identification of high-momentum, hadronically decaying V and h bosons. ATLAS developed a new technique to reconstruct the invariant mass of such bosons, combining information from the calorimeters and the central tracking detectors. The resulting improved mass resolution for reconstructed V and h bosons increased the sensitivity to very heavy signals.
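
For a candidate decay into two bosons, the resonance mass is obtained from the measured four-momenta of the final-state objects in the usual way (written here schematically; each channel applies it to its reconstructed leptons, neutrinos and boson jets):

$$ m_X = \sqrt{\Big(\sum_i E_i\Big)^{2} - \Big|\sum_i \vec{p}_i\Big|^{2}}, $$

so a heavy resonance shows up as a localised excess in the m_X spectrum on top of the smoothly falling SM background.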

No evidence for a new resonance was observed in these searches, allowing ATLAS to set stringent exclusion limits. For example, a graviton signal predicted in a model with extra spatial dimensions was excluded up to masses of 4 TeV, while heavy weak-boson-like resonances (as predicted in composite Higgs boson models) decaying to WZ bosons are excluded for masses up to 3.3 TeV. Heavier Higgs partners can be excluded up to masses of about 350 GeV, assuming specific model parameters.

Optical survey pinpoints dark-matter structure

During the last two decades the WMAP and Planck satellites have produced detailed maps of the density distribution of the universe when it was only 380,000 years old – the moment electrons and protons recombined into neutral hydrogen, producing today’s cosmic microwave background (CMB). The CMB measurements show that the distribution of both normal and dark matter in the universe is inhomogeneous, which is explained via a combination of inflation, dark matter and dark energy: initial quantum fluctuations in the very early universe expanded and continued to grow as gravity pulled matter together while dark energy worked to force it apart. Data from the CMB have allowed cosmologists to predict a range of cosmological parameters such as the fractions of dark energy, dark matter and normal matter.

Now, using new optical measurements of the current universe from the international Dark Energy Survey (DES), these predictions can be tested independently. DES is an ongoing, five-year survey that aims to map 300 million galaxies and tens of thousands of galaxy clusters using a 570 megapixel camera to capture light from galaxies eight billion light-years away (see figure). The camera, one of the most powerful in existence, was built and tested at Fermilab in the US and is mounted on the 4 m Blanco telescope in Chile.

To measure how the clumps seen in the CMB evolved from the early universe into their current state, the DES collaboration first mapped the distribution of galaxies in the universe precisely. The researchers then produced detailed maps of the matter distribution using weak gravitational lensing, which measures small distortions of the optical image due to the mass between an observer and multiple sources. The galaxies observed by DES are elongated by only a few per cent due to lensing and, since galaxies are intrinsically elliptical, the lensing cannot be measured from individual galaxies; it must instead be extracted statistically from large samples.
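
Schematically, and to first order in the lensing shear γ, each measured galaxy ellipticity is the sum of its unknown intrinsic shape and the lensing distortion,

$$ e_{\mathrm{obs}} \simeq e_{\mathrm{int}} + \gamma, $$

and because intrinsic orientations are random, averaging over N galaxies in a patch of sky gives ⟨e_obs⟩ ≈ γ with a statistical noise of roughly σ_e/√N. Since the intrinsic ellipticity dispersion σ_e (tens of per cent) is an order of magnitude larger than the lensing distortion, hundreds of galaxies are needed in each patch of sky, which is why surveys of tens of millions of galaxies are required.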

The first year of DES data, which includes measurements of 26 million galaxies, has allowed researchers to measure cosmological parameters such as the matter density with a precision comparable to that of the CMB data. The matter-density parameter, which indicates the total fraction of matter in the universe, measured using optical light is found to be fully compatible with Planck data based on measurements of microwave radiation emitted around 13 billion years ago. Combining the measurements of Planck and DES places further constraints on this crucial parameter, indicating that only about 30% of the universe consists of matter while the rest consists of dark energy. The DES results also agree with Planck on other important cosmological parameters, such as the fluctuation amplitude, which indicates the amplitude of the initial density fluctuations, and they further constrain measurements of the Hubble constant and even the sum of the neutrino masses.

The DES results allow for a fully independent measurement of parameters initially derived using a map of the early universe. With the DES data sample set to grow from 26 million to 300 million galaxies, cosmological parameters will be measured with even higher precision and allow more detailed comparisons with the CMB data.

From the web to a start-up near you

The core mission of CERN is fundamental research in particle physics. Yet, as a publicly funded laboratory, it also has a remit to ensure that its technology and expertise deliver prompt and tangible benefits to society wherever possible. Other physics-research laboratories and institutes were early adopters of CERN technologies, thanks to the highly collaborative nature of particle physics. Since its creation in 1954, CERN has also been active in transferring technology to industry, mainly through purchasing contracts or collaboration agreements. Through novel developments in the field of accelerator technologies and detectors, and more recently in computing and digital sciences, CERN technologies and know-how have contributed to applications in many fields, including the World Wide Web, invented at CERN by Tim Berners-Lee in 1989.

As its impact broadened, CERN set up a reinforced policy and team in 1997 to support its knowledge- and technology-transfer activities. Twenty years later, these activities are still going strong. Some 18 start-up companies around the world are currently using CERN technology, and CERN has developed a network of Business Incubation Centres (BICs) in nine different Member States. Its knowledge-transfer activities have had an impact on a wide range of fields, from medical and biomedical technologies to aerospace applications, safety, “industry 4.0” and the environment.

Maximising the societal impact of CERN technologies is a key aim for CERN’s knowledge-transfer activities. To do this effectively, CERN has set up a thematic forum with delegates from all of its Member States and associate Member States. Regular meetings are held at CERN, and beginning this year there will also be forum meetings dedicated to medical applications – which is one of the most prominent examples of CERN’s impact so far.

Technology for health

Early activities at CERN relating to medical applications date back to the 1970s, and have been triggered for the most part by individual initiatives. The multiwire proportional chamber conceived in 1968 by CERN physicist Georges Charpak not only opened a new era for particle physics and earned its inventor the 1992 Nobel Prize in Physics, but also found important X-ray and gamma-ray imaging applications in biology, radiology and nuclear medicine. Essential early work at CERN also contributed significantly to the development of advanced detectors and analysis techniques for positron emission tomography (PET). In particular, starting in 1975 with famous images of a mouse, CERN physicist David Townsend led important contributions to the reconstruction of PET images and to the development of 3D PET, in collaboration with the University of Geneva and the Geneva Cantonal Hospital.

After these individual efforts, in the 1990s CERN witnessed the first collaborative endeavours in medical applications. The Crystal Clear and Medipix collaborations started to explore the feasibility of developing technologies used in the LHC detectors – scintillating crystals and hybrid silicon pixel detectors, respectively – for possible medical applications, such as PET and X-ray imaging. At the same time, the Proton Ion Medical Machine Study (PIMMS) was initiated at CERN, with the aim of producing a synchrotron design optimised for treating cancer patients with protons and carbon ions. The initial design was improved by the TERA Foundation, and finally evolved into the machine built for the CNAO treatment centre in Italy, with seminal contributions from INFN. Later on, MedAustron in Austria built its treatment centre starting from the CNAO design. Beyond the initial design study, CERN contributed to the realisation of the CNAO and MedAustron treatment centres, in particular with expertise in accelerators and magnets and with training of personnel.

For the past 50 years, CERN has hosted the ISOLDE facility dedicated to the production of a large variety of radioactive ion beams for different experiments in the fields of nuclear and atomic physics, solid-state physics, materials science and life sciences. Over 1200 radioisotopes from more than 70 chemical elements have been made available for fundamental and applied research, including in the medical field. A particular highlight was the demonstration in 2012 of the efficiency of terbium-149, one of the lightest alpha emitters, for treatment at the level of single cancer cells. The growing worldwide interest in novel isotopes suitable for theragnostics, namely the possibility to perform both imaging and treatment at the same time, has motivated an extension of ISOLDE called CERN-MEDICIS (Medical Isotopes Collected from ISOLDE). This new facility will produce, as of this autumn, innovative isotopes for performing medical research at collaborating institutes (CERN Courier October 2016 p28).

Today, activities pertinent to medical applications are happening in all areas of CERN, with some compelling examples highlighted in the panel opposite. In June 2017, CERN Council approved a document setting out the “Strategy and framework applicable to knowledge transfer by CERN for the benefit of medical applications”.

Aiming high

Aerospace and particle physics might not at first seem obvious partners. However, both fields have to deal with radiation and other extreme environments, posing stringent technological requirements that are often similar. CERN operates testing facilities and develops qualification technologies for high-energy physics, which are useful for ground testing and qualification of flight equipment. This opportunity is particularly attractive for miniaturised satellites called CubeSats that typically use commercial off-the-shelf components for their electronics, since radiation qualification according to standard procedures is expensive and time-consuming. The CERN Latchup Experiment STudent sAtellite (CELESTA) intends to develop a CubeSat version of RadMon, a radiation monitor developed at CERN, and to prove that low-Earth orbit qualification can be performed in CERN’s High energy AcceleRator Mixed field facility (CHARM). CELESTA is being developed in collaboration with the University of Montpellier and this year was selected by ESA’s “Fly Your Satellite!” programme to be deployed in orbit in 2018 or 2019.

Magnesium diboride (MgB2), the high-temperature superconductor that will be used for the innovative electrical transmission lines of the high-luminosity LHC, has also demonstrated its potential for future space missions. Within the framework of the European Space Radiation Superconducting Shield (SR2S) project, which aims to demonstrate the feasibility of using superconducting magnetic shielding technology to protect astronauts from cosmic radiation, CERN successfully tested a prototype racetrack coil wound with a MgB2 superconducting tape. Astronauts’ exposure to space radiation is a major concern for future crewed missions to Mars and beyond. Monte Carlo codes such as FLUKA, initially jointly developed by CERN and INFN, and Geant4, developed and maintained by a worldwide collaboration with strong support from CERN since its conception, have been routinely used to study the radiation environment of past, recent, and future space missions. The TimePix detectors, which are USB-powered particle trackers based on the Medipix technology, are already used by NASA on board the International Space Station to accurately monitor radiation doses.

CERN’s computing expertise is also finding applications in aerospace. To solve the challenge of sharing software and code in big-data environments, researchers at CERN have developed a system called CernVM-FS (CernVM File System), which is currently used in high-energy physics experiments to distribute about 350 million files. The system is now also being used for Euclid, a European space mission that aims to study the nature of dark matter and dark energy, to deploy software in Euclid’s nine science data centres.

CERN technologies and know-how have found concrete applications in a variety of other fields. One of them is safety: CERN’s unique working environment, which combines various types of radiation, extremely low temperatures, ultra-high magnetic fields and very high voltages, requires innovative solutions for detecting threats and preventing risks. An example is B-rad, a portable meter for radiation safety in strong magnetic fields, initially developed by CERN’s radiation-protection group and fire brigade. With a financial contribution from the CERN Knowledge Transfer (KT) Fund, the product has been brought from lab prototype to finalised product in collaboration with an Italian company. Another example is Kryolize, novel cryogenic-safety software also supported by the CERN KT Fund. Six Kryolize licences have now been granted to other research laboratories, with potential application domains ranging from the food industry to cryogenic techniques in medicine.

CERN also taps into its technologies and creativity to address the challenge of a healthier and more sustainable planet. Its contributions in this area range from novel biochemical sensors for water safety to new irrigation techniques for the most challenging agricultural environments. The innovative Non Evaporable Getter (NEG) technology developed to reach ultra-high-vacuum conditions in the LHC vacuum chambers, for example, has also been used successfully in other applications, including thermal solar panels.

MgB2-based superconducting power cables could also offer significant power-transmission solutions for densely populated, high-load areas, and CERN is part of a consortium to build a prototype to demonstrate the feasibility of this concept.

Another buzz-worthy trend in industry is the so-called “industry 4.0”, a push towards increasing automation and efficiency in manufacturing processes with connected sensors and machines, autonomous robots and big-data technology. CERN’s accelerators, detectors and computing facilities naturally call for the use of the latest industry-4.0 technology, while the technological solutions to CERN’s own challenges can be used in the automation industry. In the field of robotics, CERN has developed TIM (Train Inspection Monorail), a mini vehicle that autonomously monitors the 27 km-long LHC tunnel, moving along tracks suspended from the tunnel’s ceiling, and can be programmed to perform real-time inspection missions. This innovation has already caught the eye of industry, in particular for autonomous monitoring of utilities infrastructure, such as underground water pipelines. Sensor technologies developed at CERN are also being used in drones: the start-up Terabee, for example, uses them for aerial inspections and imaging services. Since expanding its business to include CERN sensor technology, the start-up has won first place in the automation category of Startup World at Automatica.

Boosting KT in practice

One of the main challenges in the knowledge-transfer sphere is to make it as easy as possible for scientists and other specialists to turn their research into innovations, and CERN invests much effort in such activities. Launched in 2011, the CERN KT Fund bridges the gap between research and industry by awarding grants to projects proposed by CERN personnel where there is high potential for a positive impact on society. Since its creation, 40 projects have been funded, each receiving a grant of between CHF 15,000 and CHF 240,000 over a period of one or several years. Among them were projects addressing thermal management in space applications, very large-scale software distribution, distributed optical-fibre sensors and long-term data preservation for digital libraries. In 2016, two European Commission-funded projects, AIDA-2020 and ARIES, incorporated a proof-of-concept fund modelled on CERN’s KT Fund.

Since the early days of technology transfer at CERN, one of the main focuses has been on knowledge transfer through people, especially early career scientists who work in industry after their contracts at CERN or who start their own company. Over the last 20 years, CERN has continued to build a general culture of entrepreneurship within the Organization through many different avenues. To assist entrepreneurs and small technology businesses in taking CERN technologies and expertise to the market, CERN has established a network of nine BICs throughout its Member States where companies can directly express their interest in adopting a CERN technology. The BIC managers provide office space, expertise, business support, access to local and national networks and support in accessing funding. There are currently 18 start-ups and spin-offs using CERN technologies in their business, with four joining BICs last year alone: Ross Robotics (exploiting software developed for production tasks at CERN); Innocryst (developing a system to identify and track gemstones); Colnec Health (using CERN’s know-how in Grid middleware technology) and Camstech (novel electrochemical sensor technologies).

Every year since 2008, students from the School of Entrepreneurship (NSE) at the Norwegian University of Science and Technology (NTNU) spend a week at CERN to evaluate the commercial potential of CERN technologies. Three of the students attending the CERN-NTNU screening week in 2012 started the spin-off TIND, which is based on the open-source software Invenio. TIND now has contracts to host Invenio for, among others, the UNESCO International Bureau of Education, the California Institute of Technology and the Max Planck Institute for Extraterrestrial Physics.

Getting the next generation of scientists into the habit of thinking about their research in terms of impact is vital for knowledge transfer to thrive. In 2015, CERN launched a series of Entrepreneurship Meet-Ups (EM-Us) to foster entrepreneurship within the CERN community. Selected CERN and external entrepreneurship experts present their expertise at informal get-togethers and the events offer a good opportunity to network. In October this year, the EM-Us are celebrating their 50th event, with over 1000 attendees since the series was created, and a new informal “KT-clinic” service has been launched.

Many more interesting projects are in the pipeline. CERN’s knowledge in superconducting technologies can be used in MRI and gantries for hadron therapy, while its skills in handling large amounts of data can benefit the health sector more widely. Detector technologies developed at CERN can be used in non-destructive testing techniques, while compact accelerators benefit the analysis of artworks. These are just some of the examples of new projects we are working on, and more initiatives will be started to meet the needs of industrial and research partners in CERN’s Member States and associate Member States for the next 20 years and beyond.

Cutting-edge medical technologies under CERN’s microscope

Novel designs for compact medical accelerators
Thanks to cutting-edge studies on beam dynamics and radio-frequency technology, along with innovative construction techniques, teams at CERN have manufactured an innovative linear accelerator designed to be compact, modular, low-cost and suitable for medical applications. The accelerator is a radio-frequency quadrupole (RFQ) operating at a frequency of 750 MHz, which had never been achieved before, and capable of producing low-intensity beams of just a few microamps with no significant losses. The high-frequency RFQ capitalises on the skills and know-how developed at CERN while designing Linac 4, and is a perfect injector for the new generation of high-frequency compact linear accelerators being developed for hadron therapy.

Expertise in high-gradient accelerating structures gathered by the Compact Linear Collider (CLIC) group at CERN is also being applied to novel designs for hadron-therapy facilities, such as the cyclinac concept proposed by the TERA foundation, as well as the development of accelerators to boost the energy of medical cyclotrons to provide proton-imaging capabilities.

CERN’s know-how in cryogenic systems is also interesting for modern superconducting medical accelerators, such as the compact cyclotron being developed by CIEMAT for on-site production in hospitals of isotopes for PET.

Detectors and medical imaging
Medipix3 is a CMOS pixel detector read-out chip designed to be connected to a segmented semiconductor sensor. Like its predecessor, Medipix2, it acts as a camera taking images based on the number of particles that hit the pixels when the electronic shutter is open. However, Medipix3 aims to go much further than Medipix2 by permitting colour imaging and dead-time-free operation. Ten years ago, a member of the Medipix3 collaboration founded a company in New Zealand and obtained a licence to exploit the chip for spectral computed tomography imaging – X-ray imaging in colour. The company’s pre-clinical scanners enable researchers and clinicians to study biochemical and physiological processes in specimens and small animals. In a related development, the Timepix3 chip permits trigger-free particle tracking in a single semiconductor layer. Preliminary measurements using the previous generation of the chip point strongly to its potential for beam and dose monitoring in the hadron-therapy environment.

Since 1997, CERN’s Crystal Clear collaboration (CCC) has been using its expertise in scintillators to develop and construct PET prototypes. Its first success was the ClearPET concept: the development of several prototypes resulted in the commercialisation of a small-animal scanner with breakthrough performance and led to the first simultaneous PET/CT image of a mouse in 2015. In 2002, the CCC started developing dedicated PET scanners for breast imaging, called ClearPEM, with two prototypes undergoing clinical trials. Recent CCC developments are focused on time-of-flight PET scanners for better image quality. Via the European FP7 project EndoTOFPET-US, CCC members are developing a novel bi-modal time-of-flight PET and ultrasound endoscope prototype dedicated to the early-stage detection of pancreatic and prostate cancer.

Computing and simulations
Simulation codes initially developed for HEP, such as Geant4 and FLUKA, have also become crucial to modelling the effects of radiation on biological tissues for a variety of applications in the medical field. FLUKA is licensed to various companies in the medical field: in particular, FLUKA-based physics databases are at the core of the commercial treatment-planning systems (TPS) clinically used at HIT and CNAO, as well as of the TPS for carbon ions for MedAustron. Geant4 is adopted by thousands of users worldwide for applications in a variety of domains: examples of Geant4 extensions for use in the medical field are GATE, TOPAS and Geant4-DNA.

Computing tools, infrastructures and services developed for HEP also have great potential for applications in the medical field. CERN openlab has recently started two collaborative projects in this domain: BioDynaMo aims to design and build a cloud-based computing platform for rapid simulation of biological tissue dynamics, such as brain development; GeneROOT aims to use ROOT to analyse large genomics data sets, beginning with data from TwinsUK, the largest UK adult twins registry.

 

A brief history of knowledge transfer at CERN

• 1954: Since its creation, CERN has been active in knowledge and technology transfer, although with no formal structure.

• 1974 & 1983–1984: Two external studies consider the economic impact of CERN contracts and find that it equates to around 3–3.5 times the contract value.

• 1987: CERN’s Annual Report incorporates the first dedicated section on technology-transfer activities.

• 1988: The Industry and Technology Liaison Office (ITLO) is founded at CERN to stimulate interaction with industry, including through procurement.

• June 1997: With the support of Council, CERN sets up a reinforced structure for technology transfer.

• November 1997: The Basic Science and Technology Transfer: Means and Methods in the CERN Environment workshop helps to identify appropriate technology-transfer mechanisms.

• 1998: CERN develops a series of reports on intellectual-property-rights protection practices that was endorsed by the finance committee.

• 1999: First technology-transfer policy at CERN, with the new technology-transfer service replacing the ITLO and three main actions: to encourage the protection of intellectual-property rights for new technologies developed at CERN and the institutes participating in its scientific programme; to promote the training of young scientists in intellectual-property rights; and to promote entrepreneurship.

• 2001: CERN begins to report on its technology-transfer activities annually to the CERN finance committee.

• 2008: Creation of HEPTech, a technology-transfer network for high-energy physics.

• 2010: CERN develops a new policy on the management of intellectual property in technology-transfer activities at CERN.

• 2014: OECD publishes a report entitled “The Impacts of Large Research Infrastructures on Economic Innovation and on Society: Case studies at CERN”, which praises innovation at CERN.

• 2016: CERN is featured as a leader in the World Intellectual Property Organisation (WIPO) Global Innovation Index.

• 2017: CERN publishes new medical-applications strategy, works on a set of updated software and spin-off and patent policies, and launches a revamped knowledge-transfer website: kt.cern.

 

Neutrinos on nuclei

A major focus of experiments at the Large Hadron Collider (LHC) is to search for new phenomena that cannot be explained by the Standard Model of particle physics. In addition to sophisticated analysis routines, this requires detailed measurements of particle tracks and energy deposits produced in large detectors by the LHC’s proton–proton collisions and, in particular, precise knowledge of the collision energy. The LHC’s counter-rotating proton beams each carry an energy of 6.5 TeV and this quantity is known to a precision of about 0.1 per cent – a feat that requires enormous technical expertise and equipment.

So far, no clear signs of physics beyond the Standard Model (BSM) have been detected at the LHC or at other colliders where a precise knowledge of the beam energy is needed. Indeed, the only evidence for BSM physics has come from experiments in which the beam energy is known very poorly. In 1998, in work that would lead to the 2015 Nobel Prize in Physics, researchers discovered that neutrinos have mass and that, therefore, these elementary particles cannot be purely left-handed, as had been assumed by the Standard Model. The discovery came from the observation of oscillations of atmospheric and solar neutrinos. The energies of the latter are determined by various elementary processes in the Sun and cover a wide range from a few keV up to about 20 MeV.

Since then, dedicated “long-baseline” experiments have started to explore neutrino oscillations under controlled conditions by sending neutrino beams produced in accelerator labs to detectors located hundreds of kilometres away. The T2K experiment in Japan shoots a beam from J-PARC into the Super-Kamiokande underground detector about 300 km away, while the NOvA experiment in the US aims a beam produced at Fermilab at an above-ground detector near the Canadian border about 800 km away. Finally, the international Deep Underground Neutrino Experiment (DUNE), for which prototype detectors are being assembled at CERN (see “Viewpoint: CERN’s recipe for knowledge transfer”), will send a neutrino beam from Fermilab over a distance of 1300 km to a detector in the old Homestake gold mine in South Dakota. The targets in these experiments are all nuclei, rather than single protons, and the neutrino-beam energies range from a few hundred MeV to about 30 GeV.

Such poor knowledge of neutrino-beam energies is no longer acceptable for the science that awaits us. All long-baseline experiments aim to determine crucial neutrino properties, namely: the neutrino mixing angles; the value of a CP-violating phase in the neutrino sector; and the so far unknown mass ordering of the three neutrino flavours. Extracting these parameters is only possible by knowing the incoming neutrino energies, and these have to be reconstructed from observations of the final state of a neutrino–nucleus reaction. This calls for Monte Carlo generators that, unlike those used in high-energy physics, not only describe elementary particle reactions and their decays but also reactions with the nuclear environment.

Beam-energy problem

Neutrino beams have been produced for more than 50 years, famously allowing the discovery of the muon neutrino at Brookhaven National Laboratory in 1962. The difficulty in knowing the energy of a neutrino beam, as opposed to the situation at colliders such as the LHC, stems from the way the beams are produced. First, a high-current proton beam is fired into a thick target to produce many secondary particles such as charged pions and kaons, which are emitted in the forward direction. A device called a magnetic horn, invented by Simon van der Meer at CERN in 1961, then bundles the charged particles into a given direction as they decay into neutrinos and their corresponding charged leptons. Once the leptons have been removed by appropriate absorber materials, a neutrino beam emerges.

Whereas a particle beam in a high-energy accelerator has a diameter of about 10 μm, the width of the neutrino beam at its origin is determined by the dimensions of the horn, which is typically of the order of 1 m. Since the pions and kaons are produced with their own energy spectra, which have been measured by experiments such as HARP and NA61/SHINE at CERN, their two-body decays into a charged lepton and a neutrino lead to a broad neutrino-energy distribution. By the time a neutrino beam reaches a long-baseline detector, it may be as wide as a few kilometres and its energy is known only in broad ranges. For example, the beam envisaged for DUNE will have a distribution of energies with a maximum at about 2.5 GeV, with tails all the way down to 0 GeV on one side and 30 GeV on the other. While the high-energy tail may be small, the neutrino–nucleon cross-section in this region scales roughly linearly with the neutrino energy, so that even small tails contribute to interactions in the detector (figure 1).
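
The width of this distribution follows directly from the decay kinematics. In the pion rest frame the neutrino from π → μν carries a fixed energy of about 30 MeV, but in the laboratory the energy depends strongly on both the pion energy Eπ and the angle θ between the neutrino and the pion flight direction; for relativistic pions and small angles,

$$ E_\nu \;\approx\; \frac{\bigl(1 - m_\mu^2/m_\pi^2\bigr)\,E_\pi}{1 + \gamma^2\theta^2}, \qquad \gamma = \frac{E_\pi}{m_\pi}, $$

so an on-axis neutrino carries about 43% of its parent pion’s energy, while off-axis decays and the spread of pion energies populate the rest of the spectrum.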

The neutrino energy is a key parameter in the formula governing neutrino-oscillation probabilities and must be reconstructed on an event-by-event basis. In a “clean” two-body reaction such as νμ + n → μ + p, where a neutrino undergoes quasi-elastic (QE) scattering off a neutron at rest, the neutrino energy can be determined from the kinematics (energy and angle) of the outgoing muon alone. This kinematical or QE-based approach requires a sufficiently good detector to make sure that no inelastic excitations of the nucleon have taken place. Alternatively, the so-called calorimetric method measures the energies of all the outgoing particles to yield the incoming neutrino energy. Since both methods suffer from less-than-perfect detectors with limitations in acceptance and efficiency, the reconstructed energy may not be equal to the true energy and detector simulations are therefore essential.
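
To make the dependence explicit: in the two-flavour approximation the muon–neutrino survival probability is

$$ P(\nu_\mu \to \nu_\mu) \;\approx\; 1 - \sin^2 2\theta \, \sin^2\!\left(1.27\,\frac{\Delta m^2[\mathrm{eV}^2]\, L[\mathrm{km}]}{E_\nu[\mathrm{GeV}]}\right), $$

so an error on E_ν translates directly into an error on the extracted mixing parameters. For the kinematical method, a commonly used reconstruction formula (assuming QE scattering off a neutron at rest, bound with an average energy E_B) is

$$ E_\nu^{\mathrm{QE}} \;=\; \frac{m_p^2 - (m_n - E_B)^2 - m_\mu^2 + 2(m_n - E_B)E_\mu}{2\left(m_n - E_B - E_\mu + |\vec p_\mu| \cos\theta_\mu\right)}, $$

where E_μ, p_μ and θ_μ are the energy, momentum and scattering angle of the outgoing muon.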

A major additional complication comes about because all modern neutrino experiments use nuclear targets, such as water in T2K and argon in DUNE. Even assuming that the neutrino–nucleus interaction can be described as a superposition of quasi-free interactions of the neutrino with individual nucleons, the latter are bound and move with their Fermi motion. As a result, the kinematical method suffers because the initial-state neutron is no longer at rest but moves with a momentum of up to about 225 MeV, smearing the reconstructed neutrino energy around its true value by a few tens of MeV. Furthermore, final-state interactions concerning the hadrons produced – both between themselves and with the nuclear environment of the detector – significantly complicate the energy reconstruction procedure. Even true initial QE events cannot be distinguished from events in which first a pion is produced and then is absorbed in the nuclear medium (figure 2), and the kinematical approach to energy reconstruction necessarily leads to a wrong neutrino energy.
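
As a rough numerical illustration of this smearing, the toy Monte Carlo below (a sketch only, not any experiment’s actual generator) scatters a 2.5 GeV neutrino quasi-elastically off neutrons drawn from a simple Fermi sphere and then applies the at-rest reconstruction formula quoted above to the outgoing muon; the isotropic emission angles, Fermi momentum and binding energy are illustrative assumptions rather than tuned values.

```python
# Toy Monte Carlo: how Fermi motion alone smears the kinematically
# reconstructed neutrino energy in quasi-elastic (QE) scattering.
# Illustrative sketch only: isotropic muon emission in the centre-of-mass
# frame, a uniform Fermi sphere for the neutron momentum, a constant
# binding energy, and no cross-section weights or final-state interactions.
import numpy as np

M_N, M_P, M_MU = 0.9396, 0.9383, 0.1057    # neutron, proton, muon masses (GeV)
E_B, P_FERMI = 0.025, 0.225                # toy binding energy and Fermi momentum (GeV)
E_NU_TRUE = 2.5                            # true neutrino energy (GeV), DUNE-like
N = 100_000
rng = np.random.default_rng(1)

def boost(p4, beta):
    """Boost four-vectors p4 = (E, px, py, pz), shape (N, 4), into frames
    moving with per-event velocities beta, shape (N, 3)."""
    b2 = np.einsum('ij,ij->i', beta, beta)
    gamma = 1.0 / np.sqrt(1.0 - b2)
    bp = np.einsum('ij,ij->i', p4[:, 1:], beta)
    E = gamma * (p4[:, 0] - bp)
    p = p4[:, 1:] + ((gamma - 1.0) * bp / b2 - gamma * p4[:, 0])[:, None] * beta
    return np.column_stack([E, p])

def iso_directions(n):
    """n unit vectors drawn isotropically."""
    cost = rng.uniform(-1.0, 1.0, n)
    phi = rng.uniform(0.0, 2.0 * np.pi, n)
    sint = np.sqrt(1.0 - cost**2)
    return np.column_stack([sint * np.cos(phi), sint * np.sin(phi), cost])

# Incoming neutrino along +z; target neutron drawn from a uniform Fermi sphere.
p_nu = np.zeros((N, 4))
p_nu[:, 0] = p_nu[:, 3] = E_NU_TRUE
pn_mag = P_FERMI * rng.random(N) ** (1.0 / 3.0)
p_n = np.column_stack([np.sqrt(M_N**2 + pn_mag**2), pn_mag[:, None] * iso_directions(N)])

# Centre-of-mass (CM) frame of the neutrino-neutron system.
p_tot = p_nu + p_n
beta_cm = p_tot[:, 1:] / p_tot[:, [0]]
s = p_tot[:, 0]**2 - np.einsum('ij,ij->i', p_tot[:, 1:], p_tot[:, 1:])

# Two-body final state mu + p: momentum magnitude and (toy) isotropic direction in the CM.
pstar = np.sqrt((s - (M_MU + M_P)**2) * (s - (M_MU - M_P)**2)) / (2.0 * np.sqrt(s))
p_mu_cm = np.column_stack([np.sqrt(M_MU**2 + pstar**2), pstar[:, None] * iso_directions(N)])

# Muon in the lab frame, then the standard kinematical reconstruction,
# which wrongly assumes a neutron at rest bound with energy E_B.
p_mu = boost(p_mu_cm, -beta_cm)
e_mu = p_mu[:, 0]
pmu_mag = np.sqrt(np.maximum(e_mu**2 - M_MU**2, 0.0))
cos_th = p_mu[:, 3] / np.maximum(pmu_mag, 1e-12)
m_eff = M_N - E_B
e_rec = (M_P**2 - m_eff**2 - M_MU**2 + 2.0 * m_eff * e_mu) / \
        (2.0 * (m_eff - e_mu + pmu_mag * cos_th))

q16, q50, q84 = np.percentile(e_rec, [16, 50, 84])  # robust against rare pathological events
print(f"true E_nu          = {E_NU_TRUE:.3f} GeV")
print(f"reconstructed E_nu = {q50:.3f} GeV (median), 68% interval [{q16:.3f}, {q84:.3f}] GeV")
```

Running the sketch shows the reconstructed energy scattered around the true value; switching off the Fermi motion (P_FERMI = 0) makes the reconstruction exact up to the small binding-energy offset.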

The calorimetric method, on the other hand, suffers because detectors often do not see all particles at all energies. Here, the challenge to determine the neutrino energy is to “calculate backwards” from the final state, which is only partly known due to detector imperfections, to the incoming state of the reaction. One can gain an impression of how good this backwards calculation has to be by considering figure 3, which shows the sensitivity of the oscillation signal to changes in the CP-violating phase angle: for DUNE and T2K an accuracy of about 100 MeV and 50 MeV is required, respectively, to distinguish between the various curves showing the expected oscillation signal for different assumptions about the phase and the neutrino mass-ordering. At the same time, one sees that the oscillation maxima have to be determined within about 20% to be able to measure the phase and mass-ordering.

Detectors near and far

Neutrino physicists working on long-baseline experiments have long been aware of the problems in extracting the oscillation signal. The standard remedy is to build a detector not only at the oscillation distance (called the far detector, FD) but also one close to the neutrino production target (the near detector, ND). By dividing the event rates seen in the FD by those in the ND, the oscillation probability follows directly. The division also leads one to hope that uncertainties in our knowledge of cross-sections and reaction mechanisms cancel out, making the resulting probability less sensitive to uncertainties in the energy reconstruction. In practice, however, there are obstacles to such an approach. For instance, often the ND contains a different active material and has a different geometry to the FD, the latter simply because of the significant broadening of the neutrino beam with distance between the ND and the FD. Furthermore, due to the oscillation the energy spectrum of neutrinos is different in the ND than it is in the FD. It is therefore vital to have a good understanding of the interactions in different target nuclei and energy regimes, because the energy reconstruction has to be done separately at both the ND and the FD.
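
Schematically, the number of events reconstructed at a given energy in each detector factorises as N(E) ∝ Φ(E) σ(E) ε(E) M, with the oscillation probability appearing only in the far detector, so that

$$ \frac{N_{\mathrm{FD}}(E)}{N_{\mathrm{ND}}(E)} \;\approx\; P(\nu_\mu \to \nu_\mu; E)\times \frac{\Phi_{\mathrm{FD}}(E)\,\varepsilon_{\mathrm{FD}}(E)\,M_{\mathrm{FD}}}{\Phi_{\mathrm{ND}}(E)\,\varepsilon_{\mathrm{ND}}(E)\,M_{\mathrm{ND}}}, $$

and the cross-section σ(E) formally cancels in the ratio. The cancellation is only exact, however, if the flux shape Φ, efficiency ε, target mass M and, crucially, the mapping from true to reconstructed energy are the same in both detectors, which is precisely what the differences just described spoil.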

To place neutrino–nucleus reactions on a more solid empirical footing, neutrino researchers have started to measure the relevant cross-sections to a much higher accuracy than was possible at previous experiments such as CERN’s NOMAD. MiniBooNE at Fermilab has provided the world’s largest sample of charged-current events (QE-like reactions and pion production) on a target consisting of mineral oil, for example, and the experiment is now being followed by the nearby MicroBooNE, which uses an argon target. At higher energies, the MINERvA experiment (also at Fermilab) is dedicated to determining neutrino–nucleus cross-sections in an energy distribution that peaks at about 3.5 GeV and is thus close to that expected for DUNE. Cross-section measurements are also taking place in the NDs of T2K and NOvA, which are crucial to our understanding of neutrino–nucleus interactions and for benchmarking new neutrino generators.

Neutrino generators provide simulations of the entire neutrino–nucleus interaction, from the very first initial neutrino–nucleon interaction to the final state of many outgoing and interacting hadrons, and are needed to perform the backwards computation from the final state to the initial state. Such generators are also needed to estimate effects of detector acceptance and efficiency, similar to the role of GEANT in other nuclear and high-energy experiments. These generators should be able to describe all of the interactions over the full energy range of interest in a given experiment, and should also be able to describe neutrino–nucleus and hadron–nucleus interactions involving different nuclei. Obviously, the generators should therefore be based on state-of-the-art nuclear physics, both for the nuclear structure and for the actual reaction process.

Presently used neutrino generators, such as NEUT or GENIE, deal with the final-state interactions by employing Monte Carlo cascade codes in which the nucleons move freely between collisions and nuclear-structure information is kept at a minimum. The challenge here is to deal correctly with questions such as relativity in many-body systems, nuclear potentials and possible in-medium changes of the hadronic interactions. Significant progress has been made recently in describing the structure of target nuclei and their excitations in so-called Green’s function Monte Carlo theory. A similarly sophisticated approach to the hadronic final-state interactions is provided by the non-equilibrium Green’s function method. This method, the foundations of which were written down half a century ago, is the only known way to describe high-multiplicity events while taking care of possible in-medium effects on the interaction rates and the off-shell transport between collisions. Only during the last two decades have numerical implementations of this quantum-kinetic transport theory become possible. A neutrino generator built on this method (GiBUU) has recently been used to explore the uncertainties that are inherent in the kinematical-energy reconstruction method for the very same process shown in figure 3, and the result of that study (figure 4) gives an idea of the errors inherent in such an energy reconstruction.

Generators that contain all the physics of neutrino–nucleus interactions are absolutely essential to get at the neutrino’s intriguing properties in long-baseline experiments. The situation is comparable to that in experiments at the LHC and at the Relativistic Heavy Ion Collider at Brookhaven that study the quark–gluon plasma (QGP). Here, the existence and properties of the QGP can be inferred only by calculating backwards from the final-state observations with “normal” hadrons to the hot and dense collision zone with gluons and quarks. Without a full knowledge of the neutrino–nucleus interactions the neutrino energy in current and future long-baseline neutrino experiments cannot be reliably reconstructed. Thus, generators for neutrino experiments should clearly be of the same quality as the corresponding experimental apparatus. This is where the expertise and methods of nuclear physicists are needed in experiments with neutrino beams.

A natural test bed for these generators is provided by data from electron–nucleus reactions. These test the vector part of the neutrino–nucleus interaction and thus constitute a mandatory test for any generator. Experiments with electrons at JLAB are presently in a planning stage. Since the energy reconstruction has to start from the final state of the reaction, the four-vectors of all final-state particles are needed for the backwards calculation to the initial state. Inclusive lepton–nucleus cross-sections, with no information on the final state, are therefore not sufficient.

Call to action

All of this has been realised only recently, and there is now a growing community trying to bring together the experimental high-energy physicists who work on long-baseline experiments and nuclear theorists. There is a dedicated conference series called NUINT, and meetings such as WIN and NUFACT now also have sessions on neutrino–nucleus interactions.

We face a challenge that is completely new to high-energy physics experiments: the reconstruction of the incoming energy from the final state requires a good description of the nuclear ground state, control of neutrino–nucleus interactions and, on top of all this, control of the final-state interactions of the hadrons when they cascade through the nucleus after the primary neutrino–nucleon interaction. Neutrino generators that fulfil all of these requirements can minimise the uncertainties in the energy reconstruction. They should therefore attract the same attention and support as the development of new equipment for long-baseline neutrino experiments, since their quality ultimately determines the precision of the extracted neutrino properties.

CERN strengthens ties with South Asia

Particle physics is evolving as a result of greater co-ordination and collaboration on a global scale. This goes hand in hand with CERN’s policy of increased networking with organisations and institutions worldwide. In 2010, the CERN Council approved a radical shift in CERN’s membership policy that opened full membership to non-European states, irrespective of their geographical location. At the same time, the Council introduced the status of associate membership to facilitate the accession of new members, including emerging countries outside of Europe, which might not command sufficient resources to sustain full membership (CERN Courier December 2014 p58). Interest in membership and associate membership of CERN continues to grow (CERN Courier January/February 2017 p5).

CERN’s geographical enlargement policy offers clear opportunities to reinforce the long-term aspirations of the high-energy physics community. Enlargement is not an aim in itself. Rather, the focus is on strengthening relations with countries that can bring scientific and technological expertise to CERN and, in return, allow countries with developing particle-physics communities to build capacity. Presently, CERN has 22 Member States, seven associate Member States, and six states and organisations with observer status. From the South Asia region, Pakistan and India have recently become associate members. International Co-operation Agreements (ICAs) have been signed with Bangladesh, Nepal and Sri Lanka.

The first CERN South Asian High Energy Physics Instrumentation (SAHEPI) workshop on detector technology and applications, held on 20–21 June at Nepal’s Kathmandu University, Dhulikhel, brought together physicists and policy makers from the South Asia region and Mauritius. Representatives from Afghanistan, Bangladesh, Bhutan, India, the Maldives, Mauritius, Nepal, Pakistan and Sri Lanka were invited to attend, although the representatives from the Maldives and Bhutan were unable to come due to other commitments. At least one senior scientist and one student from each attending country participated, making a total of about 70 participants, more than half of whom were students.

The aim of the workshop was to bring together representatives from CERN and South Asia countries to strengthen the scientific co-operation between the Organization and the South Asia region. The workshop also provided the opportunity for countries to establish new contacts within the region, with the objective of initiating new intra-regional co-operation in high-energy physics and related technologies. The workshop was initiated as part of CERN’s broader efforts to establish relations with regions, complementing relations with individual countries, and follows similar regional approaches in Latin America and South-east Asia.

Rewards of participation

Progress in particle physics relies on close collaboration between physicists, technicians, hardware and software engineers and industry, and both Member States and associate Member States enjoy opportunities to apply for staff and fellowship positions and bid for CERN contracts. Collaborating with CERN also trains young scientists and engineers in cutting-edge projects, giving them expertise that can be brought to their home nations, and offers great opportunities for educating the next generation through CERN’s teacher programmes and others. Participation in CERN programmes has already fostered successful scientific collaborations in South Asia, where researchers have participated in many of CERN’s pioneering activities and made a significant contribution to the construction of the LHC. Indeed, CERN’s relations with South Asia feature strong partnerships dating back decades, particularly with India and Pakistan.

In 1994, CERN and the government of Pakistan signed an ICA concerning the development of scientific and technical co-operation in CERN’s research. This agreement was followed by the signing of several protocols, and today Pakistan contributes to the ALICE and CMS experiments as well as to accelerator projects such as CLIC/CTF3 and Linac 4, making Pakistan a significant partner for CERN. Pakistan has also built various mechanical components for ATLAS and for the LHC, and made an important contribution to the LHC consolidation programme in 2013–2014. In July 2015 Pakistan became an associate Member State of CERN.

Given its long tradition and broad spectrum of co-operation with CERN since the 1970s, combined with the country’s substantial scientific and technological potential, India applied for associate membership in 2015. In particular, in 1996 the Indian Atomic Energy Commission (AEC) agreed to take part in the construction of the LHC, and to contribute to the CMS and ALICE experiments and to the LHC Computing Grid with Tier-2 centres in Mumbai and Kolkata. In recognition of its contribution to the construction of the LHC, India was awarded observer status in 2002. The success of the partnership between CERN and the Indian Department of Atomic Energy (DAE) regarding the LHC has also led to co-operation on novel accelerator technologies through DAE’s participation in CERN’s Linac 4, Superconducting Proton Linac (SPL) and CTF3 projects, and CERN’s contribution to DAE’s programmes. India is also participating in the COMPASS, ISOLDE and n_TOF experiments. India became an associate Member State in January 2017.

Broader region

In the recent past, contacts have also been established with several other countries in the South Asia region. Collaboration between Sri Lanka and CERN has been ongoing for a number of years, following an expression of interest signed between the CMS collaboration and the University of Ruhuna in 2006. This saw the first CERN summer students from Sri Lanka and the first PhD student graduating in 2013. The conclusion of the ICA with the government of Sri Lanka in 2017, and the expected entry of a consortium of universities from Sri Lanka into the CMS collaboration, are important steps for growing the country’s national high-energy physics capacity. Sri Lanka is moving towards membership of the CMS collaboration with an interest in contributing to the experiment’s upgrade programme, driven initially by the University of Colombo and the University of Ruhuna.

Official contacts between CERN and Bangladesh were established in 2012 with the signing of an expression of interest. This provided an interim framework to enable scientists, engineers and students from universities and research institutes of Bangladesh to further develop their professional training, in particular through participation in CERN’s scientific and training programmes. In 2014 CERN and Bangladesh signed an ICA, and the first Bangladesh–CERN school on particle physics was held at the University of Dhaka that year. Bangladesh is currently participating in the ALICE experiment as an associate member and working on physics analysis. There is high potential for future development of the collaboration with CERN through the country’s 39 public and 93 other recognised universities, and the many multidisciplinary research facilities of the Bangladesh Atomic Energy Commission.

Nepal has seven universities: Tribhuvan University (the largest and, until around 1990, the only university in the country); Kathmandu University; Pokhara University; Purbanchal University; Mahendra Sanskrit University; Far-western University; and the Agriculture and Forestry University. Tribhuvan and Kathmandu universities are considered to be best placed to develop scientific and technical co-operation with CERN, and collaborative activities are already under way between CERN and Nepal. Students from Nepal have been participating in the CERN summer-student programme for non-Member State nationals, and teachers from Nepal have taken part in the CERN high-school teachers programme. Several workshops on particle physics and related areas have been held at Tribhuvan University and Kathmandu University, and an ICA between Nepal and CERN has recently been concluded.

Contacts with the physics communities of Afghanistan, Bhutan, the Maldives and Mauritius were established through the June workshop and will be pursued to explore ways and means to develop capacity in physics and related areas in these countries. The department of physics at the University of Kabul was established in 1942 and includes programmes for undergraduate and graduate studies in physics as well as student laboratories. The department of physics at the University of Mauritius includes research programmes in radioastronomy and applied radio frequency and it also operates the Mauritius Radio Telescope, while its undergraduate teaching programme includes courses in nuclear and elementary particle physics.

Moving forward

The nature of the national programmes in high-energy physics and related areas in these South Asian countries was the focus of discussions at the Kathmandu workshop. The aim was to identify synergies to establish stronger scientific collaboration across the region, such as promoting the exchange of researchers and students within the region. These efforts will help to build capacity, particularly in the countries with emerging high-energy physics programmes that face challenges in research infrastructure and university curricula.

The motivation and enthusiasm of the participants were palpable, as was the effort they put into research and education. The proceedings of the workshop, together with the identified strengths, weaknesses, opportunities and threats of the national high-energy physics programmes and the possibilities for strengthening intra-regional co-operation, will be presented to government representatives of the participating countries, with the objective of raising awareness at the highest political level.

CERN will continue to engage with the region in an effort to build capacity in high-energy physics research and education and to facilitate collaboration with the Organization. Given the success of this workshop, it was decided that the discussions will continue in Sri Lanka in 2019.

Injecting new life into the LHC

The Large Hadron Collider (LHC) is the most famous and powerful of all CERN’s machines, colliding intense beams of protons at an energy of 13 TeV. But its success relies on a series of smaller machines in CERN’s accelerator complex that serve it. The LHC’s proton injectors have already been providing beams with characteristics exceeding the LHC’s design specifications. This decisively contributed to the excellent performance of the 2010–2013 LHC physics operation and, since 2015, has allowed CERN to push the machine beyond its nominal beam performance.

Built between 1959 and 1976, the CERN injector complex accelerates proton beams to a kinetic energy of 450 GeV. It does this via a succession of accelerators: a linear accelerator called Linac 2 followed by three synchrotrons – the Proton Synchrotron Booster (PSB), the Proton Synchrotron (PS) and the Super Proton Synchrotron (SPS). The complex also provides the LHC with ion beams, which are first accelerated through a linear accelerator called Linac 3 and the Low Energy Ion Ring (LEIR) synchrotron before being injected into the PS and the SPS. The CERN injectors, besides providing beams to the LHC, also serve a large number of fixed-target experiments at CERN – including the ISOLDE radioactive-beam facility and many others.

Part of the LHC’s success lies in the flexibility of the injectors to produce various beam parameters, such as the intensity, the spacing between proton bunches and the total number of bunches in a bunch train. This was clearly illustrated in 2016 when the LHC reached peak luminosity values 40% higher than the design value of 10³⁴ cm⁻² s⁻¹, although the number of bunches in the LHC was still about 27% below the maximum achievable. This gain was due to the production of a brighter beam with roughly the same intensity per bunch but in a beam envelope of just half the size.
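
The connection between beam brightness and luminosity can be seen from the standard expression for the luminosity of two colliding bunched beams (quoted here in its usual simplified form for round Gaussian beams):

$$ \mathcal{L} \;=\; \frac{N_b^2\, n_b\, f_{\mathrm{rev}}\, \gamma_r}{4\pi\, \varepsilon_n\, \beta^*}\, F, $$

where N_b is the number of protons per bunch, n_b the number of bunches, f_rev the revolution frequency, γ_r the relativistic factor, ε_n the normalised transverse emittance, β* the optical function at the collision point and F a geometric reduction factor due to the crossing angle. Since the luminosity scales as N_b²/ε_n, keeping the bunch intensity constant while halving the transverse emittance (the “beam envelope”) roughly doubles the luminosity delivered per bunch.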

Despite the excellent performance of today’s injectors, the beams produced are not sufficient to meet the very demanding proton beam parameters specified by the high-luminosity upgrade of the LHC (HL-LHC). Indeed, as of 2025, the HL-LHC aims to accumulate an integrated luminosity of around 250 fb–1 per year, to be compared with the 40 fb–1 achieved in 2016. For heavy-ion operations, the goals are just as challenging: with lead ions the objective is to obtain an integrated luminosity of 10 nb–1 during four runs starting from 2021 (compared to the 2015 achievement of less than 1 nb–1). This has demanded a significant upgrade programme that is now being implemented.

Immense challenges

To prepare the CERN accelerator complex for the immense challenges of the HL-LHC, the LHC Injectors Upgrade project (LIU) was launched in 2010. In addition to enabling the necessary proton and ion injector chains to deliver beams of ions and protons required for the HL-LHC, the LIU project must ensure the reliable operation and lifetime of the injectors throughout the HL-LHC era, which is expected to last until around 2035. Hence, the LIU project is also tasked with replacing ageing equipment (such as power supplies, magnets and radio-frequency cavities) and improving radioprotection measures such as shielding and ventilation.

One of the first challenges faced by the LIU team members was to define the beam-performance limitations of all the accelerators in the injector chain and identify the actions needed to overcome them by the required amount. Significant machine and simulation studies were carried out over a period of years, while functional and engineering specifications were prepared to provide clear guidelines to the equipment groups. This was followed by the production of the first hardware prototype devices and their installation in the machines for testing and, where possible, early exploitation.

Significant progress has already been made concerning the production of ion beams. Thanks to the modifications in Linac 3 and LEIR implemented after 2015 and the intensive machine studies conducted within the LIU programme over the last three years, the excellent performance of the ion injector chain could be further improved in 2016 (figure 1). This enabled the recorded luminosity for the 2016 proton–lead run to exceed the target value by a factor of almost eight. The main remaining challenges for the ion beams will be to more than double the number of bunches in the LHC through complex RF manipulations in the SPS known as “momentum slip stacking”, as well as to guarantee continued and stable performance of the ion injector chain without constant expert monitoring.

Along the proton injector chain, the higher-intensity beams within a comparatively small beam envelope required by the HL-LHC can only be demonstrated after the installation of all the LIU equipment during Long Shutdown 2 (LS2) in 2019–2020. The main installations comprise: a new injection region, a new main power supply and a new RF system in the PSB; a new injection region and an RF system to stabilise the future beams in the PS; and, in the SPS, an upgraded main RF system together with the shielding of vacuum flanges and partial coating of the beam chambers to stabilise future beams against parasitic electromagnetic interactions and electron clouds. Beam instrumentation, protection devices and beam dumps also need to be upgraded in all the machines to match the new beam parameters. The baseline goals of the LIU project to meet the challenging HL-LHC requirements are summarised in the panel (final page of feature).

Execution phase

Having defined, designed and endorsed all of the baseline items during the last seven years, the LIU project is presently in its execution phase. New hardware is being produced, installed and tested in the different machines. Civil-engineering work is proceeding for the buildings that will host the new PSB main power supply and the upgraded SPS RF equipment, and to prepare the area in which the new SPS internal beam dump will be located.

The 86 m-long Linac 4, which will eventually replace Linac 2, is an essential component of the HL-LHC upgrade (see panel opposite). The machine, based on newly developed technology, became operational at the end of 2016 following the successful completion of acceleration tests at its nominal energy of 160 MeV. It is presently undergoing an important reliability run that will be instrumental in reaching beams with characteristics matching the requirements of the LIU project and in achieving an operational availability higher than 95%, an essential level for the first link in the proton injector chain. On 26 October 2016, the first 160 MeV negative hydrogen-ion beam was successfully sent to the injection test stand, which operated until the beginning of April 2017 and demonstrated the correct functioning of this new and critical CERN injection system as well as of the related diagnostics and controls.
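
The emphasis on availability can be illustrated with a deliberately simplified series-chain estimate (the individual figures below are hypothetical): since every proton destined for the LHC must pass through each injector in turn, the availability of the chain is bounded by the product of the individual availabilities,

\[
A_{\mathrm{chain}} \;\lesssim\; \prod_i A_i,
\qquad\text{e.g.}\quad 0.95^4 \approx 0.81,
\]

so even if each of the four links (Linac 4, PSB, PS, SPS) reached 95%, beam would be available to the LHC for only about 80% of the scheduled time, hence the stringent requirement placed on the very first link.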

On the PSB side, most of the equipment needed for the injection of negative hydrogen ions from Linac 4 has been completed, and work is progressing on the 2 GeV energy upgrade of the PSB rings and extraction, with installation planned for LS2 in 2019–2020. On the beam-physics side, studies have mainly focused on the deployment of the new wideband RF system, the commissioning of beam diagnostics and the investigation of space-charge effects. During the 2016–2017 technical stop, the principal LIU-related activities were the removal of a large volume of obsolete cables and the installation of new beam instrumentation (e.g. a prototype transverse beam-size measurement device and turn-by-turn orbit measurement systems). The unused cables, which had been individually identified and labelled beforehand, could be safely removed from the machine to allow cables for the new LIU equipment to be pulled.

The procurement, construction, installation and testing of upgrade items for the PS are also progressing. Some hardware, such as new corrector magnets and power supplies, a newly developed beam gas-ionisation monitor and new injection vacuum chambers to remove aperture limitations, was already installed during past technical stops. Mitigating anticipated longitudinal beam instabilities in the PS is essential for achieving the LIU baseline beam parameters. This requires reducing the parasitic electromagnetic interaction of the beam with the multiple RF systems and deploying a new feedback system to keep the beam stable. Beam-dynamics studies will determine the present intensity reach of the PS and identify any remaining needs to comfortably achieve the value required for the HL-LHC. Improved schemes of bunch rotation are also under investigation to better match the beam extracted from the PS to the SPS RF system and thus limit beam losses at injection energy in the SPS.

In the SPS, the LIU deployment in the tunnel has begun in earnest, with the re-arrangement and improvement of the extraction kicker system, the start of civil engineering for the new beam-dump system in LSS5, and the shielding of vacuum flanges in 10 half-cells together with the amorphous carbon coating of the adjacent beam chambers (to mitigate electron-cloud effects). In a notable first, eight dipole and 10 focusing-quadrupole magnet chambers were coated with amorphous carbon in situ during the 2016–2017 technical stop, demonstrating that the process can be industrialised (figure 2). The new overground RF building needed to accommodate the power amplifiers of the upgraded main RF system has been completed, while procurement and testing of the solid-state amplifiers has also commenced. Prototyping and engineering for the LIU beam dump are in progress, leading to the construction and installation of a new SPS beam-dump block that will be able to cope with the higher beam intensities of the HL-LHC while minimising radiation issues.

Regarding diagnostics, the development of beam-size measurement devices based on flying wires, gas ionisation and synchrotron radiation, all of which are part of the LIU programme, is already providing meaningful results (figure 3), addressing the challenge of measuring the high-intensity, high-brightness beams with high precision during operation. On the machine-performance and beam-dynamics side, measurements in 2015–2016 with the very high intensities available from the PS probed new regimes in terms of electron-cloud instabilities, RF power and losses at injection. More studies are planned in 2017–2018 to clearly identify a path for mitigating the injection losses when operating with higher beam currents.

Looking forward to LS2

The success of LIU in delivering beams with the desired parameters is the key to achieving the HL-LHC luminosity target. Without the LIU beams, all of the other necessary HL-LHC developments – including high-field triplet magnets, crab cavities and new collimators – would only allow a fraction of the desired luminosity to be delivered to experiments.

Whenever possible, LIU installation work is taking place during CERN’s regular year-end technical stops. But the great majority of the upgrade requires an extended machine stop and therefore will have to wait until LS2 for implementation. The duration of access to the different accelerators during LS2 is being defined and careful preparation is ongoing to manage the work on site, ensure safety and level the available resources among the different machines in the CERN accelerator complex. After all of the LIU upgrades are in place, beams will be commissioned with the newly installed systems. The LIU goals in terms of beam characteristics are, by definition, uncharted territory. Reaching them will require not only a high level of expertise, but also careful optimisation and extensive beam-physics and machine-development studies in all of CERN’s accelerators.

Linac 4 complete and preparing to serve the high-luminosity LHC

Inaugurated in May, Linac 4 is CERN’s newest accelerator acquisition since the LHC and is key to increasing the luminosity delivered to the LHC. Linac 4 will send negative hydrogen ions with an energy of 160 MeV – more than three times the energy of its predecessor Linac 2 (which has been in service since 1978) – to the Proton Synchrotron Booster, which strips off the electrons at injection and further accelerates the resulting protons. Almost 90 m long, it took nearly 10 years to build. After an extensive testing period that is already under way, Linac 4 will be connected to CERN’s accelerator complex during the long technical shutdown in 2019–2020.

Overhauling CERN’s accelerator complex

The demands of the high-luminosity LHC (HL-LHC) will see major changes across CERN’s accelerator complex (above). The LHC Injectors Upgrade project is organised into five baseline work packages to boost the performance of the LHC injectors and match the challenging HL-LHC requirements:

• Improving the ion source and low-energy transport in Linac 3, alleviating ion losses in LEIR and the SPS, and implementing momentum slip stacking for ion beams in the SPS.

• Replacing Linac 2 with Linac 4 and using H⁻ charge-exchange injection into the PSB at the increased energy of 160 MeV.

• Raising the injection energy into the PS from the present 1.4 GeV to 2 GeV.

• Doubling the RF power, reducing the longitudinal impedance and mitigating the electron cloud in the SPS.

• Putting in place all of the other necessary equipment and operational upgrades across PSB, PS and SPS to make them capable of accelerating and manipulating higher-intensity/brightness beams (e.g. intercepting and dump devices, feedback systems, beam instrumentation and resonance compensation).


CERN’s recipe for knowledge transfer

Understanding what the universe is made of and how it started are the fundamental questions behind CERN’s existence. This quest alone makes CERN a unique knowledge-focused organisation and an incredible human feat. To achieve its core mission, CERN naturally creates new opportunities for innovation. A myriad of engineers, technicians and scientists develop novel technology and know-how that can be transferred to industry for the benefit of society. Twenty years ago, with the support of CERN Council, a reinforced structure for knowledge and technology transfer was established to strengthen these activities.

Advances in fields including accelerators, detectors and computing have had a positive impact outside of CERN. Although fundamental physics might not seem the most obvious discipline in which to find technologies with marketable applications, the many examples of applications of CERN’s technology and know-how – whether in medical technology, aerospace, safety, the environment or “industry 4.0” – constitute concrete evidence that high-energy physics is a fertile ground for innovation. That CERN’s expertise finds applications in multinational companies, small and medium enterprises and start-ups alike is further proof of CERN’s broader impact (see “From the web to a start-up near you”).

As an international organisation, CERN has access to a wealth of diverse viewpoints, skills and expertise. But what makes CERN different from other organisations in other fields of research? Sociologists have long studied the structure of scientific organisations, several using CERN as a basis, and they find that particle-physics collaborations uniquely engage in “participatory collaboration” that brings added value in knowledge generation, technology development and innovation. This type of collaboration, along with the global nature of the experiments hosted by CERN, adds high value to the laboratory’s knowledge-transfer activities.

Despite its achievements in knowledge transfer internationally, CERN is seldom listed in international innovation rankings; when it is present, it is never at the top. This is mainly a selection effect due to methodology. For example, the Reuters “Top 25 Global Innovators – Government” ranking relies on patents as a proxy for innovation (of the 10 innovation criteria used, seven are based on patents). CERN’s strategy is to focus on open innovation and to maximise the dissemination of our technologies and know-how, rather than focus on revenue. Although there is a wide range of intellectual-property tools useful for knowledge transfer, patent volume is not a relevant measure of successful intellectual-property management at CERN.

Instead, the CERN Knowledge Transfer group measures the number of new technology disclosures (91 in 2016) and the number of contracts and agreements signed with external partners and industry (42 in 2016, and 251 in total since 2011). We also monitor spin-off and start-up companies – there are currently 18 using CERN technology, some of which are hosted directly in CERN’s network of business incubation centres. We believe that these figures, together with the impressive breadth of fields in which CERN technologies find application, are clearer measures of impact.

In the future, CERN will continue to pursue and promote open innovation. We want to build a culture of entrepreneurship whereby more people leaving CERN consider starting a business based on CERN technologies, and use a wide range of metrics to quantify our innovation. Strong links with industry are important to help reinforce a market-pull rather than technology-push approach. The Knowledge Transfer group will also continue to provide a service to the CERN community through advice, support, training, networks and infrastructure for those who wish to engage with industry through our activities.   

Human capital is vital in our equation, since knowledge transfer cannot happen without CERN’s engineers, technicians and physicists. Our role is to facilitate their participation, which could start with a visit to our new website, an Entrepreneurship Meet-Up (EM-U), or a visit to one of our seminars. Since they were launched roughly two years ago, EM-Us and knowledge-transfer seminars have together attracted more than 2000 people. Whether you want to tell us about an idea you have, or are curious about the impact of our technologies on society, we hope to hear from you soon.

• Find out more at kt.cern.

An Overview of Gravitational Waves: Theory, Sources and Detection

By Gerard Auger and Eric Plagnol (eds)
World Scientific


The first direct detection of gravitational waves – produced more than a billion years ago during the coalescence of two black holes of stellar origin – by the two detectors of the LIGO experiment, announced in 2016, was a tremendous milestone in the history of science. This timely book provides an overview of the field, presenting the basics of the theory and the main detection techniques.

The discovery of gravitational radiation is extraordinarily important, not only because it confirms a key prediction of Einstein’s general relativity, but also for its wider implications. A new window on the universe is opening up, with more experiments – already built or in the planning stage – joining the effort to perform precise measurements of gravitational waves.

The book, composed of eight chapters, collects contributions from many experts in the field. It first introduces the theoretical basics needed to follow the discussion of gravitational waves, so that no prior knowledge of general relativity is required. This is followed by a long chapter dedicated to the sources of such radiation accessible to present and future observations. A section is then devoted to the principles of gravitational-wave detection and to the description of present and future Earth- and space-based detectors. Finally, an alternative detection technique based on cold-atom interferometry is presented.

The Meaning of the Wave Function: In Search of the Ontology of Quantum Mechanics

By Shan Gao
Cambridge University Press


Does the wave function directly represent a state of reality, or merely a state of (incomplete) knowledge of it, or something else? This question is the starting point of this book, in which the author – a professor of philosophy – aims to make sense of the wave function in quantum mechanics and investigate the ontological content of the theory. A very powerful mathematical object, the wave function has always been the focus of a debate that goes beyond physics and mathematics to the philosophy of science.

The first part of the book (chapters 1–5) deals with the nature of the wave function and provides a critical review of its competing interpretations. In the second part (chapters 6 and 7), the author focuses on the ontological meaning of the wave function and proposes his view, which is that the wave function in quantum mechanics is real and represents the state of random discontinuous motion of particles in 3D space. He offers two main arguments supporting this new interpretation. The third part (chapters 8 and 9) is devoted to investigating possible implications. In particular, the author discusses whether the quantum ontology described by the wave function is enough to account for our definite experience, or whether additional elements, such as many worlds or hidden variables, are needed.

Aimed at readers familiar with the basics of quantum mechanics, the book could also appeal to students and researchers interested in the philosophical aspects of modern scientific theories.

Problem Solving in Quantum Mechanics: From Basics to Real-World Applications for Materials Scientists, Applied Physicists, and Device Engineers

By Marc Cahay and Supriyo Bandyopadhyay
Wiley


With the rapid development of nanoscience and nano-engineering, quantum mechanics can no longer be considered the exclusive preserve of physicists. Indeed, a fundamental understanding of physical phenomena at the nanoscale will require future electronic engineers, condensed-matter physicists and materials scientists to master the fundamental principles of quantum theory.

Noticing that many textbooks on quantum mechanics are not aimed at a wide audience of scientists, in particular those interested in practical applications and technologies at the nanoscale, the authors decided to fill this gap. They focus on the solution of problems that students and researchers working on state-of-the-art material and device applications might have to face. The problems are grouped by theme in 13 chapters, each complemented by a section of further reading.

An ideal resource for graduate students, the book is also of value to professionals who need to update their knowledge or to refocus their expertise towards nanotechnologies.
