Manufacturing and delivery of components for the new heavy-ion synchrotron SIS100 at FAIR has begun

The international Facility for Antiproton and Ion Research in Europe (FAIR) (CERN Courier May 2007 p23) is currently under construction at GSI, in Darmstadt, Germany. The FAIR accelerators will deliver antiproton and ion beams of unprecedented intensities and qualities to perform heavy-ion and antimatter research. The driver accelerator of FAIR is a fast-ramping, superconducting synchrotron, SIS100, which allows the acceleration of high-intensity beams of stable elements from protons (29 GeV) to uranium (11 GeV/u). SIS100 will be installed in an underground tunnel and all of the services will be installed in a parallel supply tunnel.

The delivery of components for SIS100 commenced at the end of 2015. On 21 December, AURION in Seligenstadt delivered the first of nine magnetic-alloy bunch-compression cavities. Following a combined site/factory acceptance test at GSI, approval for series production is now being prepared.

As the first Polish in-kind contribution, the first piece of the cryogenic bypass line, made at the Wroclaw University of Technology, was delivered in February. It will now undergo acceptance tests at GSI.

The site acceptance test of the first of a series of fast-ramped dipole magnets is in its final stage (see figure). The results available so far indicate high mechanical precision and excellent performance of the superconducting coil. Following these successful results, series production has started, and the first devices are expected to be delivered by the middle of 2016. The series devices will be tested at the new test facility at GSI, which has been set up for cold testing of FAIR magnets. In accordance with the contracts, many other SIS100 components will be delivered in 2016, including the first of a series of superconducting quadrupoles from JINR (Dubna, Russia), resonance sextupole magnets, acceleration cavities, magnet chambers, and cryo-catchers and cryo-adsorption pumps, among others.

The realisation phase of the SIS100 project is fully under way, and the work is proceeding according to schedule. The production of accelerator components is expected to take a maximum of four years.

LUNA observes a rare nuclear reaction that occurs in red giant stars

In December, the Laboratory for Underground Nuclear Astrophysics (LUNA) experiment reported the first direct observation of sodium production in red giant stars, via one of the nuclear reactions that are fundamental to the formation of the elements that make up the universe.

LUNA is a compact linear accelerator for light ions (maximum energy 400 keV). A unique facility, it is installed in a deep-underground laboratory and shielded from cosmic rays. The experiment aims to study the nuclear reactions that take place inside stars, where elements that make up matter are formed and then driven out by gigantic explosions and scattered as cosmic dust.

For the first time, LUNA has observed three low-energy resonances in the neon–sodium cycle, in the ²²Ne(p,γ)²³Na reaction, which is responsible for sodium production in red giants and for energy generation. LUNA recreates the energy ranges of nuclear reactions and, with its accelerator, goes back in time to one hundred million years after the Big Bang, when the first stars formed and the processes that gave rise to the huge variety of elements in the universe started.

This result is an important piece in the puzzle of the origin of the elements in the universe, which LUNA has been studying for 25 years. Stars assemble atoms through a complex system of nuclear reactions. Only a very small fraction of these reactions have been studied at the energies existing inside stars, and a large part of those few cases have been observed using LUNA.

A high-purity germanium detector with a relative efficiency of up to 130% was used for this particular experiment, together with a windowless gas target filled with isotopically enriched ²²Ne. The rock surrounding the underground facility at the Gran Sasso National Laboratory, along with additional passive shielding, protected the experiment from cosmic rays and ambient radiation, making the direct observation of such a rare process possible.

DAMPE joins the search for dark matter in space

On 17 December, the Chinese Academy of Sciences (CAS) successfully launched the DArk Matter Particle Explorer (DAMPE) satellite from the Jiuquan Satellite Launch Center in northwest China, marking the entrance of a new player in the global hunt for dark matter.

The nature of dark matter is one of the most fundamental questions of modern science, and many experiments have been set up to unravel this mystery, using large underground detectors, colliders (for example, the LHC) or space missions (for example, AMS, CERN Courier November 2014 p6, or CALET, CERN Courier November 2015 p11).

DAMPE is the first science satellite launched by CAS. Built with advanced particle-detection technologies, DAMPE will extend the dark-matter search in space into the multi-TeV region. It will measure electrons and photons in the 5 GeV–10 TeV range with unprecedented energy resolution (1.5% at 100 GeV), to search for signals of dark-matter annihilation in these channels. It will also precisely measure the flux of nuclei up to above 100 TeV, bringing new insights into the origin and propagation of high-energy cosmic rays. With its excellent photon-detection capability, the DAMPE mission is also well placed for new discoveries in high-energy γ-ray astronomy. The DAMPE collaboration consists of Chinese (Purple Mountain Observatory, University of Science and Technology of China, Institute of High Energy Physics, Institute of Modern Physics in Lanzhou, National Space Science Center) and European (University of Geneva, INFN Perugia, Bari and Lecce) institutes.

The DAMPE detector weighs 1.4 tonnes and consumes 400 W. It consists of, from top to bottom, a plastic scintillator detector (PSD) that serves as an anti-coincidence detector, a silicon–tungsten tracker-converter (STK), a BGO imaging calorimeter of about 31 radiation lengths, and a neutron detector (NUD). The STK, which greatly improves the tracking and photon-detection capability of DAMPE, was proposed and designed by the European team and was constructed in Europe, in collaboration with IHEP, in a record time of two years. DAMPE became a CERN-recognised experiment in March 2014 and has profited greatly from the CERN test-beam facilities, both at the Proton Synchrotron and the Super Proton Synchrotron. In fact, CERN provided more than 60 days of beam from July 2012 to December 2015, allowing DAMPE to calibrate its detector extensively with various types of particles, with energies ranging from 1 to 400 GeV.

Three days after the launch, on 20 December, the STK was powered on, and four days later, the high voltage of the calorimeter was also turned on. To the satisfaction of the collaboration, all of the detector sub-systems functioned very well, and in-orbit commissioning is now well under way to tune the detector to optimal condition for the three-year observation period. A great deal of data collection, processing and analysis lies ahead, but thanks to CERN, we can look forward to a well-calibrated DAMPE detector producing exciting new measurements in the very near future.

Is there a ‘ninth planet’ after all?

Pluto was considered to be the ninth planet of the solar system until it was reclassified as a “dwarf planet” by the International Astronomical Union (IAU) in 2006. It was judged to be too small among the many other trans-Neptunian objects to be considered a real planet. Almost 10 years later, two astronomers have now found indications of the presence of a very distant heavy planet orbiting the Sun. While it has yet to be detected directly, it is already causing a great deal of excitement in the scientific community and beyond.

Pluto was discovered in 1930 by a young American astronomer, Clyde Tombaugh, who tediously examined innumerable photographic plates to detect an elusive planet moving relative to the background stars. With the progressive discovery – since the 1990s – of hundreds of objects orbiting beyond Neptune, Pluto is no longer alone in the outer solar system. It even lost its status as the heaviest trans-Neptunian object with the discovery of Eris in 2003. This forced the IAU to rethink the definition of a planet and led to the exclusion of Pluto from the strict circle of eight planets.

Eris is not the only massive trans-Neptunian object found by Mike Brown, an astronomer of the California Institute of Technology (Caltech), US, and colleagues. There are also Quaoar (2002), Sedna (2003), Haumea (2004) and Makemake (2005), all only slightly smaller than Pluto and Eris. Despite these discoveries, almost nobody during recent years would have thought that there could still be a much bigger real planet in the outskirts of our solar system. But this is what Mike Brown and one of his colleagues, the theorist Konstantin Batygin, now propose.

The two astronomers deduced the existence of a ninth planet through mathematical modelling and computer simulations, but have not yet observed the object directly. The evidence comes from an unexpected clustering of perihelion positions and orbital planes of a group of objects just outside of the orbit of Neptune, in the so-called Kuiper belt. All six objects with the most elongated orbits – with semi-major axes greater than 250 AU – share similar perihelion positions and pole orientations. The combined statistical significance of this clustering is 3.8σ, assuming that Sedna and the five other peculiar planetoids have the same observational bias as other known Kuiper-belt objects.
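For reference, a 3.8σ result corresponds to a one-sided Gaussian tail probability of roughly 0.007%, i.e. about a 1-in-14,000 chance that the observed alignment is a coincidence. A minimal sketch of the conversion, using only the Python standard library:

```python
import math

def sigma_to_pvalue(n_sigma: float) -> float:
    """One-sided Gaussian tail probability for an n-sigma result."""
    return 0.5 * math.erfc(n_sigma / math.sqrt(2.0))

p = sigma_to_pvalue(3.8)
print(f"p = {p:.1e}")  # p ~ 7e-05, i.e. about 0.007%
```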

Batygin and Brown then show that a planet with more than about 10 times the mass of the Earth, in a distant eccentric orbit anti-aligned with the six objects, would maintain the peculiar configuration of their orbits. This possible ninth planet would orbit the Sun about 20 times further out than Neptune, completing one full orbit only about once every 10,000–20,000 years. Batygin’s simulations of the effect of this new planet further predict the existence of a population of small planetoids in orbits perpendicular to the plane of the main planets. When Brown realised that such peculiar objects exist and had indeed already been identified, he became convinced of the existence of Planet Nine.
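The quoted period can be checked with Kepler's third law (for a solar orbit, the period in years equals the semi-major axis in AU raised to the power 3/2). A quick sanity check, taking the round factor of 20 relative to Neptune as an assumption:

```python
# Kepler's third law for a solar orbit: T [years] = a [AU] ** 1.5.
a_neptune = 30.1                # semi-major axis of Neptune, in AU
a_planet_nine = 20 * a_neptune  # the rough factor of 20 quoted in the text
period = a_planet_nine ** 1.5   # orbital period in years
print(f"{period:.0f} years")    # roughly 15,000 years
```

This is consistent with the quoted order of magnitude; the actual value depends on the planet's true semi-major axis, which is not yet known.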

Observers now know along which orbit they should look for Planet Nine. If it is found, this would be a major discovery: only the third planet to be discovered since ancient times, after Uranus and Neptune, and, like Neptune, it would first have been predicted to exist through calculations.

CERN’s new management begins a five-year term

Several challenges lie ahead for CERN during the years 2016–2020. With the winter technical stop of the accelerators coming to an end in March, the voyage of true post-Higgs physics exploration can start at the LHC. In the meantime, all of the other accelerators and experiments will continue to ensure that the scientific programme of the laboratory remains as diverse and compelling as it has always been.

For CERN, this means ensuring that the High-Luminosity LHC project and injector upgrades remain technically on track and financially secure, for both the accelerators and the experiments.

The rich programme of collaboration with the worldwide scientific community will be enhanced through studies and projects like the FCC study, CLIC and AWAKE. Beyond the lab, CERN will contribute to neutrino research outside of Europe through the CERN neutrino platform.

In the words of the Director-General, these years will be crucial “to start building the long-term future of particle physics”.

Fabiola Gianotti – Director-General

In 1989, Fabiola Gianotti was awarded a PhD in experimental particle physics from the University of Milan, and went on to become an eminent physicist with more than 500 authored or co-authored publications in peer-reviewed scientific journals.

Gianotti has been a research physicist in the Physics Department of CERN since 1994 – when she joined as a fellow – and since then has been involved in several CERN experiments, detector R&D and construction, as well as software development and data analysis.

From 2009 to 2013, she held the elected position of spokesperson for the ATLAS experiment, and was honoured to announce the discovery of the Higgs boson in a seminar at CERN on 4 July 2012.

During her career she has also been a member of several international committees, such as the Scientific Council of the CNRS (France), the Physics Advisory Committee of the Fermilab Laboratory (USA), the Council of the European Physical Society, the Scientific Council of the DESY Laboratory (Germany), and the Scientific Advisory Committee of NIKHEF (Netherlands). She is also a member of the Scientific Advisory Board of the UN Secretary-General, Ban Ki-moon, and of both the US National Academy of Sciences and the Italian Accademia Nazionale dei Lincei.

Since 2012, Gianotti has received several awards, including the Special Fundamental Physics Prize of the Milner Foundation (2012), the Enrico Fermi Prize of the Italian Physical Society (2013) and the Medal of Honour of the Niels Bohr Institute of Copenhagen (2013). She was also awarded the honour of “Cavaliere di Gran Croce dell’ordine al merito della Repubblica” by the Italian President.

Gianotti’s influence and success have also led to her being ranked fifth in Time magazine’s “Person of the Year 2012” poll, included in the Guardian’s 2011 “Top 100 most inspirational women” and Forbes magazine’s 2013 “Top 100 most inspirational women” lists, and named one of the “Leading Global Thinkers of 2013” by Foreign Policy magazine.

On 1 January, she became the first female Director-General of CERN.

Frédérick Bordry – Director for Accelerators and Technology

In 1978, Frédérick Bordry graduated with a PhD in electrical engineering from the Institut National Polytechnique in Toulouse, and went on to gain his higher doctorate in science from the same institute in 1985.

Bordry’s early career was spent teaching and conducting research on energy conversion. He then moved to Brazil, where he spent two years as a professor at the Federal University of Santa Catarina (Florianópolis). In 1981, he was appointed senior lecturer at the Institut National Polytechnique in Toulouse.

Bordry came to CERN in 1986, joining the group working on power converters for the Large Electron–Positron Collider (LEP), before moving in 1988 to the Operations Group as an engineer in charge of the Super Proton Synchrotron and LEP.

In 1994, the year that the LHC was approved, he joined the Power Converter Group as the head of power converters design and construction for the LHC. He was appointed leader of the Power Converter Group in 2002, a position he held until December 2008.

In 2009, Bordry was promoted to head of the CERN Technology Department – responsible for technologies specific to existing particle accelerators, facilities and future projects – where he remained until 2013.

From 2014, he served as Director for Accelerators and Technology, responsible for the operation and exploitation of the whole CERN accelerator complex, with particular emphasis on the LHC, and for the development of new projects and technologies. He has now been re-appointed to the same post for the new term.

Eckhard Elsen – Director for Research and Computing

Eckhard Elsen obtained his PhD in particle physics from Hamburg University in 1981.

Elsen’s research focused initially on e⁺e⁻ collider particle physics and led him to prominent postdoctoral positions at Hamburg University, SLAC National Accelerator Laboratory and Heidelberg University, where he first made contact with CERN as a member of the OPAL collaboration.

In 1990, Elsen was promoted to senior scientist at the Deutsches Elektronen-Synchrotron (DESY), in Germany. During this time, he became the spokesperson for the H1 experiment (an international collaboration that developed and built the H1 detector at the ep collider HERA at DESY), and later – after a sabbatical at the BaBar experiment at Stanford – project manager for the International Linear Collider (ILC) project team at DESY, through which he maintained his relationship with CERN.

In 2006, Elsen was made a professor at Hamburg University, where he taught both general physics courses and accelerator physics, and supervised students.

Elsen has co-authored two books (the most recent on the physics harvest of the LHC Run 1), worked on more than 450 publications in various fields of particle physics, and participated in many scientific committees – including chairing the LHC experiments committee from 2011 to 2014.

Martin Steinacher – Director for Finance and Human Resources

Martin Steinacher studied physics, mathematics and astronomy before going on to gain his doctorate in experimental physics at the University of Basel.

After completing his studies, Steinacher moved to the University of Berne, where he worked for seven years as a scientific collaborator on space-research projects.

He then continued as a civil servant at the Foreign Ministry, where he acted as a delegate for Switzerland and was responsible for planning the Swiss financial contributions to the European Space Agency (ESA), the European Southern Observatory (ESO) and other international organisations.

These skills led to Steinacher being appointed the scientific adviser for the Federal Office for Education and Science, before being appointed the deputy head of international co-operation at the State Secretariat for Education and Research.

In his role as chairman of the CERN Finance Committee, Steinacher worked closely with CERN member states, which led to the unanimous approval of a new method to calculate the annual scale of contribution.

In 2013, Steinacher was promoted to head of the International Research Organisations Unit, giving him high-level roles as senior scientific administrator in the ESO and ESRF Councils. His achievements while in this position include helping to negotiate Poland’s accession to ESO and also securing a funding agreement for the Swiss participation in the European Spallation Source project, until 2026.

Charlotte Lindberg Warakaulle – Director for International Relations

Since 2001, Charlotte Warakaulle has held a variety of posts at the United Nations, from associate speechwriter to chief of the Political Affairs and Partnerships Section of the United Nations Office at Geneva.

During her time in this post, she was a key focal point for relations between CERN and the United Nations Office at Geneva, and was closely involved in the first-ever UNOG-CERN Co-operation Agreement, signed in 2011.

She was also a linchpin in the preparations for CERN obtaining observer status with the General Assembly at the United Nations in 2012.

Most recently, she took on the position of chief of the United Nations Library in Geneva, where she was responsible for library services, knowledge management, cultural diplomacy and intellectual outreach.

Prior to her work with the United Nations, Warakaulle held a Carlsberg visiting research fellowship at Lucy Cavendish College at the University of Cambridge from 1998 to 2001.

During her time at the University of Cambridge, she also served as editor-in-chief of the Cambridge Review of International Affairs, a peer-reviewed international affairs journal then published by the Centre of International Studies at the University of Cambridge.

She gained her MPhil in international relations at the University of Cambridge (Pembroke College), and also holds an MA in history (cand.mag.) from the University of Copenhagen, as well as an MA in history (coursework) from the University of Sydney and a BA in history from the University of Copenhagen.

SHiP sets a new course in intensity-frontier exploration

SHiP is an experiment aimed at exploring the domain of very weakly interacting particles and at studying the properties of tau neutrinos. It is designed to be installed downstream of a new beam-dump facility at the Super Proton Synchrotron (SPS). The CERN SPS and PS Experiments Committee (SPSC) has recently completed a review of the SHiP Technical Proposal and Physics Proposal, and recommended that the SHiP collaboration proceed towards preparing a Comprehensive Design Report, which will provide input to the next update of the European Strategy for Particle Physics, in 2018–2019.

Why is the SHiP physics programme so timely and attractive? We have now observed all of the particles of the Standard Model; however, it is clear that it is not the ultimate theory. Some as-yet-unknown particles or interactions are required to explain a number of observed phenomena in particle physics, astrophysics and cosmology – the so-called beyond-the-Standard-Model (BSM) problems – such as dark matter, neutrino masses and oscillations, the baryon asymmetry and the accelerating expansion of the universe.

While these phenomena are well established observationally, they give no indication of the energy scale of the new physics. The analysis of new LHC data collected at √s = 13 TeV will soon have directly probed the TeV scale for new particles with couplings at the O(%) level. The experimental effort in flavour physics, and searches for charged-lepton flavour violation and electric dipole moments, will continue the quest for specific flavour symmetries to complement direct exploration of the TeV scale.

However, it is possible that we have not observed some of the particles responsible for the BSM problems due to their extremely feeble interactions, rather than due to their heavy masses. Even in the scenarios in which BSM physics is related to high-mass scales, many models contain degrees of freedom with suppressed couplings that stay relevant at much lower energies.

Given the small couplings and mixings, and hence typically long lifetimes, these hidden particles have not been significantly constrained by previous experiments, and the reach of current experiments is limited by both luminosity and acceptance. Hence the search for low-mass BSM physics should also be pursued at the intensity frontier, along with expanding the energy frontier.

SHiP is designed to give access to a large class of interesting models. It has discovery potential for the major observational puzzles of modern particle physics and cosmology, and can explore some of the models down to their natural “bottom line”. SHiP also has the unique potential to test lepton flavour universality by comparing interactions of muon and tau neutrinos.

SPS: the ideal machine

SHiP is a new type of intensity-frontier experiment motivated by the possibility to search for any type of neutral hidden particle with mass from sub-GeV up to O(10) GeV with super-weak couplings down to 10⁻¹⁰. The proposal locates the SHiP experiment on a new beam extraction line that branches off from the CERN SPS transfer line to the North Area. The high intensity of the 400 GeV beam and the unique operational mode of the SPS provide ideal conditions. The current design of the experimental facility and estimates of the physics sensitivities assume the SPS accelerator in its present state. Sharing the SPS beam time with other SPS fixed-target experiments and the LHC should allow 2 × 10²⁰ protons on target to be produced in five years of nominal operation.
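For a sense of scale, these numbers imply a time-averaged proton rate of order 10¹² per second. A back-of-envelope sketch (averaged over calendar time, ignoring the actual SPS duty cycle and beam sharing):

```python
protons_on_target = 2e20                # total protons over five years
years = 5
seconds_per_year = 365.25 * 24 * 3600   # calendar seconds in a year
avg_rate = protons_on_target / (years * seconds_per_year)
print(f"{avg_rate:.1e} protons/s")      # about 1.3e12 protons per second
```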

The key experimental parameters in the phenomenology of the various hidden-sector models are relatively similar. This allows a common optimisation of the design of the experimental facility and of the SHiP detector. Because the hidden particles are expected to be predominantly accessible through the decays of heavy hadrons and in photon interactions, the facility is designed to maximise their production and the detector acceptance, while providing the cleanest possible environment. As a result, with 2 × 10²⁰ protons on target, the expected yields of the different hidden particles in decays of both charm and beauty hadrons greatly exceed those of any other existing or planned facility.

As shown in the figure (left), the next critical component of SHiP after the target is the muon shield, which deflects the high flux of background muons away from the detector. The detector for the hidden particles is designed to fully reconstruct their exclusive decays and to reject the background down to below 0.1 events in a sample of 2 × 10²⁰ protons on target. The detector consists of a large magnetic spectrometer located downstream of a 50 m-long and 5 × 10 m-wide decay volume. To suppress the background from neutrinos interacting in the fiducial volume, the decay volume is maintained under vacuum. The spectrometer is designed to accurately reconstruct the decay vertex, the mass and the impact parameter of the decaying particle at the target. A set of calorimeters followed by muon chambers provides identification of electrons, photons, muons and charged hadrons. A dedicated high-resolution timing detector measures the coincidence of the decay products, which allows the rejection of combinatorial backgrounds. The decay volume is surrounded by background taggers to detect neutrino and muon inelastic scattering in the surrounding structures, which may produce long-lived SM V⁰ particles, such as the KL. The experimental facility is also ideally suited to studying interactions of tau neutrinos. It will therefore host a tau-neutrino detector, largely based on the OPERA concept, upstream of the hidden-particle decay volume (CERN Courier November 2015 p24).

Global milestones and next steps

The SHiP experiment aims to start data-taking in 2026, as soon as the SPS resumes operation after Long Shutdown 3 (LS3). The intervening 10 years consist, globally, of three years for the comprehensive design phase; then, following approval, a little less than five years of civil engineering starting in 2021, in parallel with four years of detector production and staged installation of the experimental facility; and two years to finish the detector installation and commissioning.

The key milestones during the upcoming comprehensive design phase are aimed at further optimising the layout of the experimental facility and the geometry of the detectors. This involves a detailed study of the muon-shield magnets and the geometry of the decay volume. It also comprises revisiting the neutrino background in the fiducial volume, together with the background detectors, to decide on the type of technology required for evacuating the decay volume. Many of the milestones related to the experimental facility are of general interest beyond SHiP, such as possible improvements to the SPS extraction, and the design of the target and the target complex. SHiP has already benefited from seven weeks of beam time in test beams at the PS and SPS in 2015, for studies related to the Technical Proposal (TP). A similar amount of beam time has been requested for 2016, to complement the comprehensive design studies.

The SHiP collaboration currently consists of almost 250 members from 47 institutes in 15 countries. In only two years, the collaboration has formed and taken the experiment from a rough idea in the Expression of Interest to an already mature design in the TP. The CERN task force consisting of key experts from CERN’s different departments, launched by the CERN management in 2014 to investigate the implementation of the experimental facility, made a fundamental contribution to the TP. The strength of the SHiP physics case was demonstrated by a collaboration of more than 80 theorists in the SHiP Physics Proposal.

The intensity frontier greatly complements the search for new physics at the LHC. In accordance with the recommendations of the last update of the European Strategy for Particle Physics, a wide-ranging experimental programme is being actively developed all over the world. Major improvements and new results are expected during the next decade in neutrino and flavour physics, proton-decay experiments and measurements of electric dipole moments. CERN will be well positioned to make a unique contribution to the exploration of the hidden-particle sector with the SHiP experiment at the SPS.

• For further reading, see cds.cern.ch/record/2007512.

The eye that looks at galaxies far, far away

Night is falling over Cerro Paranal, a 2600 m peak within the mountain range running along Chile’s Pacific coastline. As our eyes gradually become accustomed to total obscurity and we start to catch a glimpse of the profile of the domes on top of the Cerro, we are overwhelmed by the breathtaking view of the best starry sky we have ever seen. The centre of the Milky Way is hanging over our heads, together with the two Magellanic Clouds and the four stars of the Southern Cross. The galactic centre is so star-dense that it looks rather like a 3D object suspended in the sky.

Not a single artificial light source pollutes the site, which is literally in the middle of nowhere: the closest inhabited area is about 130 km away. The air in the austral winter in the Atacama desert is cold, but there is almost no wind, and no noise can be heard as I walk in the shadow of the four gigantic (30 m-tall) metal domes housing the four 8.2 m-diameter fixed unit telescopes (UTs) and four 1.8 m-diameter movable auxiliary telescopes (ATs) that make up the Very Large Telescope (VLT). Yet dozens of astronomers are working not far away, in a building right below the platform on top of the Cerro, overlooking the almost permanent cloud blanket over the Pacific Ocean.

As we enter the control room, I immediately feel a sense of déjà vu: a dozen busy and mostly young astronomers are drinking coffee, eating crisps and talking in at least three different languages, grouped around five islands of computer terminals.

Welcome to the nerve centre of the most complex and advanced optical telescope in the world. From here, all of the instrumentation is remotely controlled through some 100 computers connected to the telescopes by bunches of optical fibres. Four islands are devoted to the operation of all of the components of the VLT telescopes, from their domes to the mirrors and the imaging detectors, and the fifth is entirely devoted to the controls of interferometry.

Highly specialised ESO astronomers take their night shifts in this room 300 nights per year, on average. Most observations are done in service mode (60–70% of the total time), with ESO staff doing observations for other astronomers within international projects that have gone through an evaluation process and have been approved. The service mode guarantees full flexibility to reschedule observations and match them with the most suitable atmospheric conditions. The rest of the time is “visitor mode”, with the astronomer in charge of the project leading the observations, which is particularly useful whenever any real-time decision is needed.

The shift leader tonight is an Italian from Padova. He swaps from one screen to the next, trying to ignore the television crew’s microphones and cameras, while giving verbal instructions to a young Australian student. He is activating one of the VLT’s adaptive-optics systems, hundreds of small pistons positioned under the mirrors to change their curvature up to thousands of times per second, to counteract any distortion caused by atmospheric turbulence. “Thanks to adaptive optics, the images obtained with the VLT are as sharp as if we were in space,” he explains briefly, before leaning back on one of the terminals.

Complex machinery

Adaptive optics is not the only astronomers’ dream come true at the VLT. The VLT’s four 8.2 m-diameter mirrors are the largest single-piece light-collecting surfaces in the world, and the best application of active optics – the trick ESO scientists use to correct for gravitationally induced deformations as the telescope changes its orientation, and so maintain the optical quality of the vast surface. The telescope mirrors are controlled by an active support system powered by more than 250 computers, working in parallel and positioned locally in each structure, which apply the necessary force to the mirrors to maintain their alignment with one another. The correcting forces have a precision of 5 g and keep the mirror in the ideal position, adjusting it every 3 minutes with 10 nm precision. The forces are applied on the basis of the analysis of the image of a real star, taken during the observations, so that the telescope is self-adjusting. The weight of the whole structure is incredibly low for its size. The 8.2 m-diameter reflecting surface is only 17 cm thick, and the whole mirror weighs 22 tonnes; its supporting cell weighs only 10 tonnes. Another technological marvel is the secondary mirror, a single-piece lightweight hyperbolic mirror that can move along five degrees of freedom. With its 1.2 m diameter, it is the second largest object made entirely of beryllium, after the Space Shuttle doors.

But the secret of the VLT’s uniqueness lies in a tunnel under the platform. Optical interferometry is the winning idea that enables the VLT to achieve as yet unsurpassed ultra-high image resolution, by combining the light collected by the main 8.2 m UTs and the 1.8 m ATs. The physics principle behind the idea stems from Young’s 19th-century two-slit experiment, and was first applied to radio astronomy, where wavelengths are long. But in the wavelength domains of visible and infrared light, interferometry becomes a much greater challenge. It is interesting to note that the idea of using optical interferometry became a real option for the VLT at the ESO conference held at CERN in 1977 (cf Claus Madsen, The Jewel on the Mountain Top, Wiley-VCH).

With special permission from the director, and taking advantage of a technical stop to install a new instrument, we are able to visit the interferometry instrumentation room and tunnel under the platform – a privilege granted to few. The final instrument that collects and analyses all of the light coming from the VLT telescopes, after more than 25 different reflections, is kept like a jewel in a glass box in the instrumentation room. Nobody can normally get this close to it, because even the turbulence generated by a human presence can disturb its high-precision work. Following the path of the light, we enter the interferometry tunnel. The dominant blue paint of the metal rails and the size of the tunnel trigger once again an inevitable sense of déjà vu. Three horizontal telescopes travel seamlessly on two sets of four 60 m-long rails – the “delay lines” where the different arrival times of the photons at each of the telescopes are compensated for with extreme precision. These jewels of technology move continuously along the rails without electrical contact, thanks to linear motors whose coils interact directly with the magnets; no cable is connected to the telescopes on the rails, because the signals are transmitted by laser and electricity is conveyed by the rails themselves, enabling precise, smooth movement. The system is so precise that it can detect and automatically adapt to earthquakes, and measure the vibrations induced in the mountain by the waves of the Pacific Ocean 12 km away. Nowhere else has interferometry reached such complexity and been pushed so far.

Delivering science at a high rate

The resolution obtained by the Very Large Telescope Interferometer (VLTI – the name given to the telescopes when they function in this mode) is equivalent to the resolution of a 100 m-diameter mirror. Moreover, the Auxiliary Telescopes are mounted on tracks and can move over the entire telescope platform, enabling the VLTI to obtain an even better final resolution. The combined light of the 4+4 telescopes gives the light-collecting capacity of a much larger single mirror, making the VLT the largest optical instrument in the world.


Another revolution introduced by the VLT has to do with e-science. The amount of data generated by the new high-capacity VLT science instruments drove the development of end-to-end models in astronomy, introducing electronic proposal submission and service observing with processed and raw science and engineering data fed back to everyone involved. The expansion of the data links in Latin America enabled the use of high-speed internet connections spanning continents, and ESO has been able to link its observatories to the data grid. “ESO practises an open-access policy (with regulated, but limited proprietary rights for science proposers) and holds public-survey data as well. Indeed, it functions as a virtual observatory on its own,” says Claus Madsen, senior counsellor for international relationships at ESO. Currently, up to 15% of refereed science papers based on ESO data are authored by researchers not involved in the original data generation (e.g. as proposers), and an additional 10% of the papers are partly based on archival data. Thanks also to this open-access policy, the VLT has become the most productive ground-based facility for astronomy operating at visible wavelengths, with only the Hubble Space Telescope generating more scientific papers.

Watch the video at https://cds.cern.ch/record/2128425.

Data sonification enters the biomedical field

Résumé

Data sonification enters the biomedical field

Music and the life sciences have much in common: both disciplines involve the concepts of cycles, periodicity, fluctuations, transitions and even, curiously, harmony. Using the technique of sonification, scientists are able to perceive and quantify the co-ordination of human body movements, improving our knowledge and understanding of motor control as a self-organised dynamical system that moves through stable and unstable states in response to changes in organismic, task and environmental constraints.

Resonances, periodicity, patterns and spectra are well-known notions that play crucial roles in particle physics, and that have always been at the junction between sound/music analysis and scientific exploration. Detecting the shape of a particular energy spectrum, studying the stability of a particle beam in a synchrotron, and separating signals from a noisy background are just a few examples where the connection with sound can be very strong, all sharing the same concepts of oscillations, cycles and frequency.

In 1619, Johannes Kepler published his Harmonices Mundi (the “harmonies of the world”), a monumental treatise linking music, geometry and astronomy. It was one of the first times that music, an artistic form, was presented as a global language able to describe relations between time, speed, repetitions and cycles.

The research we are conducting is based on the same ideas and principles: music is a structured language that enables us to examine and communicate periodicity, fluctuations, patterns and relations. Almost every notion in the life sciences is linked with the idea of cycles, periodicity, fluctuations and transitions. These properties are naturally related to musical concepts such as pitch, timbre and modulation. In particular, vibrations and oscillations play a crucial role, both in the life sciences and in music. Take, for example, the regulation of glucose in the body. Insulin is produced by the pancreas in pulses, creating a periodic oscillation in blood insulin that is thought to prevent the down-regulation of insulin receptors in target cells. Indeed, these oscillations are so essential to the metabolic process that constant infusions of insulin can jeopardise the system.

Oscillations are also the most crucial concept in music. What we call “sound” is the perceived result of regular mechanical vibrations happening at characteristic frequencies (between 20 and 20,000 times per second). Our ears are naturally trained to recognise the shape of these oscillations, their stability or variability, the way they combine and their interactions. Concepts such as pitch, timbre, harmony, consonance and dissonance, so familiar to musicians, all have a formal description and characterisation that can be expressed in terms of oscillations and vibrations.

Many human movements are cyclic in nature. An important example is gait – the manner of walking or running. If we track the position of any point on the body over time, for example the shoulder or the knee, we see it describing a regular, cyclic movement. If the gait is stable, as in walking at a constant speed, the associated frequency is regular, with small variations due to the inherent variability of the system. By measuring, for example, the vertical displacement of the centre of each joint while walking or running, we obtain a series of one-dimensional oscillating waveforms. The collection of these waveforms provides a representation of the co-ordinated movement of the body. Studying their properties, such as phase relations, frequencies and amplitudes, then provides a way to investigate the order parameters that define modes of co-ordination.

Previous methods of examining the relations between components of the body have included statistical techniques such as principal-component analysis, and analyses of coupled oscillators through vector coding or continuous relative phase. However, data are lost when statistical techniques are used, and the small variations due to the inherent variability of the system are ignored. Conversely, coupled-oscillator analysis can cope with only two components contributing to the co-ordination.

Sonograms to study body movements

Our approach is based on the idea of analysing the waveforms and their relations by translating them into audible signals and using the natural capability of the ear to distinguish, characterise and analyse waveform shapes, amplitudes and relations. This process is called data sonification, and one of the main tools to investigate the structure of the sound is the sonogram (sometimes also called a spectrogram). A sonogram is a visual representation of how the spectrum of a certain sound signal changes with time, and we can use sonograms to examine the phase relations between a large collection of variables without having to reduce the data. Spectral analysis is a particularly relevant tool in many scientific disciplines, for example in high-energy physics, where the interest lies in energy spectra, pattern and anomaly detection, and phase transitions.
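As an illustration of the sonogram described above, the short Python sketch below computes a Hann-windowed short-time spectrum for a synthetic marker waveform. This is a minimal sketch, assuming numpy is available; the signal and window parameters are illustrative, not those of the actual study.

```python
import numpy as np

fs = 8000                          # audio sampling rate (Hz), illustrative
t = np.arange(0, 2.0, 1.0 / fs)    # 2 s of signal
# Synthetic "marker" waveform standing in for a resampled joint-centre
# displacement: a 440 Hz fundamental plus a weaker harmonic.
marker = np.sin(2 * np.pi * 440 * t) + 0.3 * np.sin(2 * np.pi * 880 * t)

def sonogram(x, fs, nperseg=512, hop=256):
    """Hann-windowed short-time power spectrum.
    Returns (frequencies, spectrum), with spectrum rows = frequency bins
    and columns = analysis frames (time)."""
    win = np.hanning(nperseg)
    frames = np.array([x[i:i + nperseg] * win
                       for i in range(0, len(x) - nperseg + 1, hop)])
    spec = np.abs(np.fft.rfft(frames, axis=1)) ** 2
    return np.fft.rfftfreq(nperseg, d=1.0 / fs), spec.T

freqs, S = sonogram(marker, fs)
# The strongest spectral line sits near the 440 Hz fundamental.
peak_hz = freqs[np.argmax(S.mean(axis=1))]
```

In the real study the input would be the resampled kinematic waveforms rather than a synthetic tone; stable spectral lines then correspond to stable gait frequencies, and their width reflects the variability of the movement.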

Using a sonogram to examine the movement of multiple markers on the body in the frequency domain, we can obtain an individual and situation-specific representation of co-ordination between the major limbs. Because anti-phase frequencies cancel, in-phase frequencies enhance each other, and a certain degree of variability in the phase of the oscillation results in a band of frequencies, we are able to represent the co-ordination within the system through the resulting spectrogram.
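The cancellation and enhancement just described can be verified with a minimal numpy sketch, in which two anti-phase sinusoids cancel while two in-phase ones reinforce each other (the marker names and frequency are illustrative, not measured values):

```python
import numpy as np

fs = 8000
t = np.arange(0, 1.0, 1.0 / fs)
f_gait = 440  # audible stand-in for a shared gait frequency (illustrative)

left_knee   = np.sin(2 * np.pi * f_gait * t)          # reference phase
right_elbow = np.sin(2 * np.pi * f_gait * t + np.pi)  # anti-phase
left_ankle  = np.sin(2 * np.pi * f_gait * t)          # in-phase

anti = left_knee + right_elbow   # anti-phase frequencies cancel
inph = left_knee + left_ankle    # in-phase frequencies enhance each other

def rms(x):
    """Root-mean-square amplitude of a waveform."""
    return np.sqrt(np.mean(x ** 2))
```

In a sonogram of the combined signal, the anti-phase line vanishes while the in-phase line doubles in amplitude; a slightly variable phase difference would instead smear the line into a band, as noted above.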

In our study, we can see exactly this. A participant ran on a treadmill that accelerated from 0 to 18 km/h over two minutes. A motion-analysis system was used to collect 3D kinematic data from 24 markers placed bilaterally on the head, neck, shoulders, elbows, wrists, hands, pelvis, hips, knees, heels, ankles and toes of the participant (sampling frequency 100 Hz, trial length 120 s). Individual and combined sensor measurements were resampled to generate audible waveforms. Sonograms were then computed, using moving Hann (Hanning) analysis windows, for all of the sound signals computed for each marker and combination of markers.
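The resampling step, which takes 100 Hz kinematic data to an audible waveform, can be sketched as follows. This is only an illustration under assumed values: the 1.5 Hz, 5 cm knee displacement and the 100x speed-up are invented for the example, not taken from the study.

```python
import numpy as np

fs_motion, duration = 100, 120     # Hz and seconds, as in the trial
fs_audio = 8000                    # target audio rate (illustrative)
speedup = 100                      # play the trial 100x faster (illustrative)

t_motion = np.arange(0, duration, 1.0 / fs_motion)
# Illustrative vertical knee displacement while running: ~1.5 Hz, 5 cm.
knee_z = 0.05 * np.sin(2 * np.pi * 1.5 * t_motion)

# Resample onto the audio time grid by linear interpolation.  Played
# 100x faster, the 1.5 Hz gait cycle is shifted to an audible 150 Hz.
n_audio = int(round(duration / speedup * fs_audio))
t_audio = np.arange(n_audio) / fs_audio
waveform = np.interp(t_audio * speedup, t_motion, knee_z)

# Dominant frequency of the audible waveform (1.5 Hz x speedup = 150 Hz).
dominant_hz = np.fft.rfftfreq(n_audio, 1.0 / fs_audio)[
    np.argmax(np.abs(np.fft.rfft(waveform)))]
```

Feeding such resampled waveforms, alone or summed across markers, into a sonogram is what produces the spectral lines and bands discussed in the figures.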

Sonification of individual and combined markers is shown above right. Sonification of an individual marker placed on the left knee (top left in the figure) shows the frequencies underpinning the marker movement on that particular joint-centre. By combining the markers, say of a whole limb such as the leg, we can examine the relations of single markers, through the cancellation and enhancement of frequencies involved. The result will show some spectral lines strengthening, others disappearing and others stretching to become bands (top right). The nature of the collective movements and oscillations that underpin the mechanics of an arm or a leg moving regularly during the gait can then be analysed through the sound generated by the superposition of the relative waveforms.

A particularly interesting case appears when we combine audifications of marker signals coming from opposing limbs, for example left leg/right arm or right leg/left arm. The sonogram bottom left in the figure is the representation of the frequency content of the oscillations related to the combined sensors on the left leg and the right arm (called additive synthesis, in audio engineering). If we compare the sonogram of the left leg alone (top right) and the combination with the opposing arm, we can see that some spectral lines disappear from the spectrum, because of the phase opposition between some of the markers, for example the left knee and the right elbow, the left ankle and the right hand.

The final result of this cancellation is a globally simpler dynamical system, described by a smaller number of frequencies. The frequencies themselves, their sharpness (variability) and the point of transition provide key information about the system. In addition, we are able to observe and hear the phase transition between the walking and running state, indicating that our technique is suitable for examining these order-parameter states. By examining movement in the frequency domain, we obtain an individual and situation-specific representation of co-ordination between the major limbs.

Sonification of movement as audio feedback

Sonification, as in the example above, does not require data reduction. It can provide us with unique ways of quantifying and perceiving co-ordination in human movement, contributing to our knowledge and understanding of motor control as a self-organised dynamical system that moves through stable and unstable states in response to changes in organismic, task and environmental constraints. For example, the specific measurement described above is a tool to increase our understanding of the adaptability of human motor control to something like a prosthetic limb. The application of this technique will aid diagnosis and tracking of pathological and perturbed gait, for example highlighting key changes in gait with ageing or leg surgeries.

In addition, we can also use sonification of movements as a novel form of audio feedback. Movement is key to healthy ageing and recovery from injuries or even pathologies. Physiotherapists and practitioners prescribe exercises that take the human body through certain movements, creating certain forces. The precise execution of these exercises is fundamental to the expected benefits, and while this is possible under the watchful eye of the physiotherapist, it can be difficult to achieve when alone at home.

In precisely executing exercises, there are three main challenges. First, there is the patient’s memory of what the correct movement or exercise should look like. Second, there is the ability of the patient to execute correctly the movement that they are required to do, working the right muscles to move the joints and limbs through the correct space, over the right amount of time or through an appropriate amount of force. Last, finding the motivation to perform sometimes painful, strenuous or boring exercises, sometimes many times a day, is a challenge.

Sonification can provide not only real-time audio feedback but also elements of feed-forward, which provides a quantitative reference for the correct execution of movements. This means that the patient has access to a map of the correct movements through real-time feedback, enabling them to perform correctly. And let’s not forget about motivation. Through sonification, in response to the movements, the patient can generate not only waveforms but also melodies and sounds that are pleasing.

Another possible application of generating melodies associated with movement is in the artistic domain. Accelerometers, vibration sensors and gyroscopes can turn gestures into melodic lines and harmonies. The demo organised during the public talk of the International Conference on Translational Research in Radio-Oncology – Physics for Health in Europe (ICTR-PHE), on 16 February in Geneva, was based on that principle. Using accelerometers connected to the arm of a flute player, we could generate melodies related to the movements naturally occurring when playing, in a sort of duet between the flute and the flutist. Art and science and music and movement seem to be linked in a natural but profound way by a multitude of different threads, and technology keeps providing the right tools to continue the investigation just as Kepler did four centuries ago.

Charting the future of CERN

Over the next five years, key events shaping the future of particle physics will unfold. We will have results from the second run of the LHC, and from other particle and astroparticle physics projects around the world. These will help us to chart the future scientific road map for our field. The international collaboration that is forming around the US neutrino programme will crystallise, bringing a new dimension to global collaboration in particle physics. And initiatives to host major high-energy colliders in Asia should become clear. All of this will play a role in shaping the next round of the European Strategy for Particle Physics, which will in turn shape the future of our field in Europe and at CERN.

CERN is first and foremost an accelerator laboratory. It is there that we have our greatest experience and concentration of expertise, and it is there that we have known our greatest successes. I believe that it is also there that CERN’s future lies. Whether or not new physics emerges at the LHC, and whether or not a new collider is built in Asia, CERN should aim to maintain its pre-eminence as an accelerator lab exploring fundamental physics.

CERN’s top priority for the next five years is ensuring a successful LHC Run 2, and securing the financial and technical development and readiness of the High-Luminosity LHC project. This does not mean that CERN should compromise its scientific diversity. Quite the opposite: our diversity underpins our strength. CERN’s programme today is vibrant, with unique facilities such as the Antiproton Decelerator and ISOLDE, and experiments studying topics ranging from kaons to axions. This is vital to our intellectual life, and it is a programme that will evolve and develop as physics needs dictate. Furthermore, with the new neutrino platform, CERN is contributing to projects hosted outside of Europe, notably the exciting neutrino programme underway at Fermilab.

If CERN is to retain its position as a focal point for accelerator-based physics in the decades to come, we must continue to play a leading role in global efforts to develop technologies to serve a range of possible physics scenarios. These include R&D on superconducting high-field magnets, high-gradient, high-efficiency accelerating structures, and novel acceleration technologies. In this context, AWAKE is a unique project using CERN’s high-energy, high-intensity proton beams to investigate the potential of proton-driven plasma wakefield acceleration for the very-long-term future. In parallel, CERN is playing a leading role in international design studies for future high-energy colliders that could succeed the LHC in the medium-to-long term. Circular options, with colliding electron–positron and proton–proton beams, are covered by the Future Circular Collider (FCC) study, while the Compact Linear Collider (CLIC) study offers potential technology for a linear electron–positron option reaching the multi-TeV range. To ensure a future programme that is compelling, and scientifically diverse, we are putting in place a study group that will investigate future opportunities other than high-energy colliders, making full use of the unique capabilities of CERN’s rich accelerator complex, while being complementary to other endeavours around the world. Along with the developments I mention above, these studies will also provide valuable input into the next update of the European Strategy, towards the end of this decade.

Global planning in particle physics has advanced greatly over recent years, with European, US and Japanese strategies broadly aligning, and the processes that drive them becoming ever more closely linked. For particle physics to secure its long-term future, we need to continue to promote strong worldwide collaborations, develop synergies, and bring new and emerging players, for example in Asia, into the fold.

Within that broad picture, CERN should steer a course towards a future based on accelerators. Any future accelerator facility will be an ambitious undertaking, but that should not deter us. We should not abandon our exploratory spirit just because the technical and financial challenges are intimidating. Instead, we should rise to the challenge, and develop the innovative technologies needed to make our projects technically and financially feasible.

Classical Dynamics: A Modern Perspective (2nd edition)

By E C G Sudarshan and N Mukunda
World Scientific


More than 40 years since the appearance of the first edition, this book is now published in a revised version that is presented with the same passion and dedication as the original. The authors confess that they have always had an “affair of the heart” with classical dynamics, and that affair remains very much alive.

In the volume, classical dynamics is treated as a subject in its own right, as well as a research frontier. While presenting all of the essential principles, the authors demonstrate that a number of key results originally considered only in the context of quantum theory and particle physics have their foundations in classical dynamics.

Although the text is based on what the authors define as “our understanding of quantum mechanics”, this new version builds on many suggestions from other physicists and on continuous dialogue with students using the book as a reference.
