SHiP sets a new course in intensity-frontier exploration

SHiP is an experiment aimed at exploring the domain of very weakly interacting particles and studying the properties of tau neutrinos. It is designed to be installed downstream of a new beam-dump facility at the Super Proton Synchrotron (SPS). The CERN SPS and PS Experiments Committee (SPSC) has recently completed a review of the SHiP Technical and Physics Proposals, and it recommended that the SHiP collaboration proceed towards preparing a Comprehensive Design Report, which will provide input into the next update of the European Strategy for Particle Physics, in 2018/2019.

Why is the SHiP physics programme so timely and attractive? We have now observed all the particles of the Standard Model, but it is clear that it is not the ultimate theory. Some as-yet-unknown particles or interactions are required to explain a number of observed phenomena in particle physics, astrophysics and cosmology, the so-called beyond-the-Standard-Model (BSM) problems: dark matter, neutrino masses and oscillations, the baryon asymmetry, and the accelerating expansion of the universe.

While these phenomena are well established observationally, they give no indication of the energy scale of the new physics. The analysis of new LHC data collected at √s = 13 TeV will soon have directly probed the TeV scale for new particles with couplings at the O(1%) level. The experimental effort in flavour physics, and in searches for charged-lepton flavour violation and electric dipole moments, will continue the quest for specific flavour symmetries to complement direct exploration of the TeV scale.

However, it is possible that we have not observed some of the particles responsible for the BSM problems due to their extremely feeble interactions, rather than due to their heavy masses. Even in the scenarios in which BSM physics is related to high-mass scales, many models contain degrees of freedom with suppressed couplings that stay relevant at much lower energies.

Given the small couplings and mixings, and hence typically long lifetimes, these hidden particles have not been significantly constrained by previous experiments, and the reach of current experiments is limited by both luminosity and acceptance. Hence the search for low-mass BSM physics should also be pursued at the intensity frontier, along with expanding the energy frontier.

SHiP is designed to give access to a large class of interesting models. It has discovery potential for the major observational puzzles of modern particle physics and cosmology, and can explore some of the models down to their natural “bottom line”. SHiP also has the unique potential to test lepton flavour universality by comparing interactions of muon and tau neutrinos.

SPS: the ideal machine

SHiP is a new type of intensity-frontier experiment motivated by the possibility of searching for any type of neutral hidden particle with mass from sub-GeV up to O(10) GeV and super-weak couplings down to 10⁻¹⁰. The proposal locates the SHiP experiment on a new beam extraction line that branches off from the CERN SPS transfer line to the North Area. The high intensity of the 400 GeV beam and the unique operational mode of the SPS provide ideal conditions. The current design of the experimental facility and estimates of the physics sensitivities assume the SPS accelerator in its present state. Sharing the SPS beam time with other SPS fixed-target experiments and the LHC should allow 2 × 10²⁰ protons on target to be produced in five years of nominal operation.

The key experimental parameters in the phenomenology of the various hidden-sector models are relatively similar. This allows common optimisation of the design of the experimental facility and of the SHiP detector. Because the hidden particles are expected to be predominantly accessible through the decays of heavy hadrons and in photon interactions, the facility is designed to maximise their production and detector acceptance, while providing the cleanest possible environment. As a result, with 2 × 10²⁰ protons on target, the expected yields of hidden particles produced in decays of both charm and beauty hadrons greatly exceed those of any other existing or planned facility.

As shown in the figure (left), the next critical component of SHiP after the target is the muon shield, which deflects the high flux of background muons away from the detector. The detector for the hidden particles is designed to fully reconstruct their exclusive decays and to reject the background down to below 0.1 events in the sample of 2 × 10²⁰ protons on target. It consists of a large magnetic spectrometer located downstream of a 50 m-long and 5 × 10 m-wide decay volume. To suppress the background from neutrinos interacting in the fiducial volume, the decay volume is maintained under vacuum. The spectrometer is designed to accurately reconstruct the decay vertex, the mass, and the impact parameter of the decaying particle at the target. A set of calorimeters followed by muon chambers provides identification of electrons, photons, muons and charged hadrons. A dedicated high-resolution timing detector measures the coincidence of the decay products, allowing the rejection of combinatorial backgrounds. The decay volume is surrounded by background taggers to detect neutrino and muon inelastic scattering in the surrounding structures, which may produce long-lived SM V0 particles such as the KL. The experimental facility is also ideally suited to studying interactions of tau neutrinos, and will therefore host a tau-neutrino detector, largely based on the OPERA concept, upstream of the hidden-particle decay volume (CERN Courier November 2015 p24).

Global milestones and next steps

The SHiP experiment aims to start data-taking in 2026, as soon as the SPS resumes operation after Long Shutdown 3 (LS3). Globally, the 10 years leading up to data-taking consist of three years for the comprehensive design phase, followed, after approval, by a little less than five years of civil engineering starting in 2021, in parallel with four years of detector production and staged installation of the experimental facility, and two final years to complete detector installation and commissioning.

The key milestones during the upcoming comprehensive design phase are aimed at further optimising the layout of the experimental facility and the geometry of the detectors. This involves a detailed study of the muon-shield magnets and the geometry of the decay volume. It also comprises revisiting the neutrino background in the fiducial volume, together with the background detectors, to decide on the required type of technology for evacuating the decay volume. Many of the milestones related to the experimental facility are of general interest beyond SHiP, such as possible improvements to the SPS extraction, and the design of the target and the target complex. SHiP has already benefitted from seven weeks of beam time in test beams at the PS and SPS in 2015, for studies related to the Technical Proposal (TP). A similar amount of beam time has been requested for 2016, to complement the comprehensive design studies.

The SHiP collaboration currently consists of almost 250 members from 47 institutes in 15 countries. In only two years, the collaboration has formed and taken the experiment from a rough idea in the Expression of Interest to an already mature design in the TP. The CERN task force, consisting of key experts from CERN’s different departments, which was launched by the CERN management in 2014 to investigate the implementation of the experimental facility, made a fundamental contribution to the TP. The strength of the SHiP physics case was demonstrated by a collaboration of more than 80 theorists in the SHiP Physics Proposal.

The intensity frontier greatly complements the search for new physics at the LHC. In accordance with the recommendations of the last update of the European Strategy for Particle Physics, a multi-range experimental programme is being actively developed all over the world. Major improvements and new results are expected during the next decade in neutrino and flavour physics, proton-decay experiments and measurements of the electric dipole moments. CERN will be well-positioned to make a unique contribution to exploration of the hidden-particle sector with the SHiP experiment at the SPS.

• For further reading, see cds.cern.ch/record/2007512.

The eye that looks at galaxies far, far away

Night is falling over Cerro Paranal, a 2600 m peak in the mountain range running along Chile’s Pacific coastline. As our eyes gradually adjust to the total darkness and we begin to catch a glimpse of the profile of the domes on top of the Cerro, we are overwhelmed by the breathtaking view of the best starry sky we have ever seen. The centre of the Milky Way hangs over our heads, together with the two Magellanic Clouds and the four stars of the Southern Cross. The galactic centre is so dense with stars that it looks almost like a 3D object suspended in the sky.

Not a single artificial light source pollutes the site, which is literally in the middle of nowhere: the closest inhabited area is about 130 km away. The air in the austral winter in the Atacama desert is cold, but there is almost no wind, and no noise can be heard as I walk in the shadow of four gigantic (30 m-tall) metal domes housing the four 8.2 m-diameter fixed unit telescopes (UTs) and the four 1.8 m-diameter movable auxiliary telescopes (ATs) that make up the Very Large Telescope (VLT). Yet dozens of astronomers are working not far away, in a building just below the platform on top of the Cerro, overlooking the almost permanent cloud blanket over the Pacific Ocean.

As we enter the control room, I immediately feel a sense of déjà vu: a dozen busy and mostly young astronomers are drinking coffee, eating crisps and talking in at least three different languages, grouped around five islands of computer terminals.

Welcome to the nerve centre of the most complex and advanced optical telescope in the world. From here, all of the instrumentation is remotely controlled through some 100 computers connected to the telescopes by bundles of optical fibres. Four islands are devoted to the operation of all of the components of the VLT telescopes, from their domes to the mirrors and the imaging detectors, and the fifth is entirely devoted to the controls of interferometry.

Highly specialised ESO astronomers take their night shifts in this room 300 nights per year, on average. Most observations are done in service mode (60–70% of the total time), with ESO staff doing observations for other astronomers within international projects that have gone through an evaluation process and have been approved. The service mode guarantees full flexibility to reschedule observations and match them with the most suitable atmospheric conditions. The rest of the time is “visitor mode”, with the astronomer in charge of the project leading the observations, which is particularly useful whenever any real-time decision is needed.

The shift leader tonight is an Italian from Padova. He swaps from one screen to the next, trying to ignore the television crew’s microphones and cameras while giving verbal instructions to a young Australian student. He is activating one of the VLT’s adaptive-optics systems: hundreds of small pistons positioned under the mirrors change their curvature up to thousands of times per second, to counteract distortions caused by atmospheric turbulence. “Thanks to adaptive optics, the images obtained with the VLT are as sharp as if we were in space,” he explains briefly, before leaning back on one of the terminals.

Complex machinery

Adaptive optics is not the only astronomers’ dream come true at the VLT. The VLT’s four 8.2 m-diameter mirrors are the largest single-piece light-collecting surfaces in the world, and the best application of active optics – the trick ESO scientists use to correct for gravitationally induced deformations as the telescope changes its orientation, and so maintain the optics of the vast surface. The telescope mirrors are controlled by an active support system powered by more than 250 computers, working in parallel and positioned locally in each structure, which apply the necessary forces to keep the mirrors aligned with one another. The correcting forces have a precision of 5 g and keep each mirror in the ideal position, adjusting it every 3 minutes with 10 nm precision. The forces are applied on the basis of the analysis of the image of a real star, taken during the observations, so the telescope is self-adjusting. The weight of the whole structure is remarkably low for its size: the 8.2 m-diameter reflecting surface is only 17 cm thick, and the whole mirror weighs 22 tonnes, while its supporting cell weighs only 10 tonnes. Another technological marvel is the secondary mirror, a single-piece lightweight hyperbolic mirror that can move along five degrees of freedom. With its 1.2 m diameter, it is the second-largest object made entirely of beryllium, after the Space Shuttle doors.

But the secret of the VLT’s uniqueness lies in a tunnel under the platform. Optical interferometry is the winning idea that enables the VLT to achieve as-yet-unsurpassed ultra-high image resolution, by combining the light collected by the main 8.2 m UTs and the 1.8 m ATs. The physics principle behind the idea stems from Young’s 19th-century two-slit experiment, and was first applied in radio astronomy, where the wavelengths are long. In the wavelength domains of visible and infrared light, however, interferometry becomes a much greater challenge. It is interesting to note that the idea of using optical interferometry became a real option for the VLT at the ESO conference held at CERN in 1977 (see Claus Madsen, The Jewel on the Mountaintop, Wiley-VCH).

With special permission from the director, and taking advantage of a technical stop to install a new instrument, we are able to visit the interferometry instrumentation room and the tunnel under the platform – a privilege granted to few. The final instrument that collects and analyses all of the light coming from the VLT telescopes, after more than 25 different reflections, is kept like a jewel in a glass box in the instrumentation room. Nobody can normally get this close to it, because even the turbulence generated by a human presence can disturb its high-precision work. Following the path of the light, we enter the interferometry tunnel. The dominant blue paint of the metal rails and the size of the tunnel trigger once again an inevitable sense of déjà vu. Three horizontal telescopes travel seamlessly on two sets of four 60 m-long rails – the “delay lines”, where the different arrival times of the photons at each of the telescopes are compensated for with ultra-high precision. These jewels of technology move continuously along the rails without electrical contact, thanks to linear motors whose coils interact directly with magnets; no cables are connected to the telescopes on the rails, because the signals are transmitted by laser and the electricity is conveyed by the rails themselves, enabling precise and smooth movement. The system is so precise that it can detect and automatically adapt to earthquakes, and measure the vibrations provoked in the mountain by the waves of the Pacific Ocean 12 km away. Nowhere else has interferometry reached such complexity or been pushed so far.

Delivering science at a high rate

The resolution obtained by the Very Large Telescope Interferometer (VLTI – the name given to the telescopes when they operate in this mode) is equivalent to that of a 100 m-diameter mirror. Moreover, because the auxiliary telescopes are mounted on tracks and can move across the entire telescope platform, the VLTI can achieve an even better final resolution. Combining the light of the 4+4 telescopes gives the same light-collecting capacity as a much larger individual mirror, making the VLT the largest optical instrument in the world.
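The size of that gain is easy to quantify with the standard diffraction-limit formula θ ≈ 1.22 λ/D, treating the 100 m baseline as an equivalent aperture, as the article does. This is an illustrative sketch: the 2.2 μm near-infrared wavelength is an assumed value, not one quoted in the text.

```python
import math

# Diffraction-limited angular resolution, theta ~ 1.22 * lambda / D,
# at an assumed observing wavelength of 2.2 micrometres (near-infrared).
wavelength = 2.2e-6  # metres (illustrative assumption)

def resolution_mas(diameter_m: float) -> float:
    """Angular resolution in milliarcseconds for an aperture of given diameter."""
    theta_rad = 1.22 * wavelength / diameter_m
    return theta_rad * (180 / math.pi) * 3600 * 1000  # radians -> mas

single_ut = resolution_mas(8.2)   # one 8.2 m unit telescope: ~68 mas
vlti = resolution_mas(100.0)      # 100 m equivalent aperture: ~5.5 mas

print(round(single_ut, 1), round(vlti, 1))
```

The ratio of the two results is simply the ratio of the diameters, about a factor of 12 in angular resolution.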

Up to 15% of refereed science papers based on ESO data are authored by researchers not involved in the original data generation

Another revolution introduced by the VLT has to do with e-science. The amount of data generated by the new high-capacity VLT science instruments drove the development of end-to-end models in astronomy, introducing electronic proposal submission and service observing, with processed and raw science and engineering data fed back to everyone involved. The expansion of data links in Latin America enabled the use of high-speed internet connections spanning continents, and ESO has been able to link its observatories to the data grid. “ESO practises an open-access policy (with regulated, but limited, proprietary rights for science proposers) and holds public-survey data as well. Indeed, it functions as a virtual observatory on its own,” says Claus Madsen, senior counsellor for international relations at ESO. Currently, up to 15% of refereed science papers based on ESO data are authored by researchers not involved in the original data generation (e.g. as proposers), and an additional 10% of the papers are partly based on archival data. Thanks in part to this open-access policy, the VLT has become the most productive ground-based facility for astronomy operating at visible wavelengths, with only the Hubble Space Telescope generating more scientific papers.

Watch the video at https://cds.cern.ch/record/2128425.

Data sonification enters the biomedical field

Summary

Data sonification enters the biomedical field

Music and the life sciences have much in common: both disciplines involve the concepts of cycles, periodicity, fluctuations, transitions and even, curiously, harmony. Using the technique of sonification, scientists are able to perceive and quantify the co-ordination of human body movements, improving our knowledge and understanding of motor control as a self-organised dynamical system that passes through stable and unstable states in response to changes in organismic, task and environmental constraints.

Resonances, periodicity, patterns and spectra are well-known notions that play crucial roles in particle physics, and that have always been at the junction between sound/music analysis and scientific exploration. Detecting the shape of a particular energy spectrum, studying the stability of a particle beam in a synchrotron, and separating signals from a noisy background are just a few examples where the connection with sound can be very strong, all sharing the same concepts of oscillations, cycles and frequency.

In 1619, Johannes Kepler published his Harmonices Mundi (the “harmonies of the world”), a monumental treatise linking music, geometry and astronomy. It was one of the first times that music, an artistic form, was presented as a global language able to describe relations between time, speed, repetitions and cycles.

The research we are conducting is based on the same ideas and principles: music is a structured language that enables us to examine and communicate periodicity, fluctuations, patterns and relations. Almost every notion in life sciences is linked with the idea of cycles, periodicity, fluctuations and transitions. These properties are naturally related to musical concepts such as pitch, timbre and modulation. In particular, vibrations and oscillations play a crucial role, both in life sciences and in music. Take, for example, the regulation of glucose in the body. Insulin is released by the pancreas in pulses, creating a periodic oscillation in blood insulin that is thought to prevent the down-regulation of insulin receptors in target cells. Indeed, these oscillations are so essential to the metabolic process that constant infusions of insulin can jeopardise the system.

Oscillations are also the most crucial concept in music. What we call “sound” is the perceived result of regular mechanical vibrations happening at characteristic frequencies (between 20 and 20,000 times per second). Our ears are naturally trained to recognise the shape of these oscillations, their stability or variability, the way they combine and their interactions. Concepts such as pitch, timbre, harmony, consonance and dissonance, so familiar to musicians, all have a formal description and characterisation that can be expressed in terms of oscillations and vibrations.

Many human movements are cyclic in nature. An important example is gait – the manner of walking or running. If we track the position of any point on the body in time, for example the shoulder or the knee, we would see it describe a regular, cyclic movement. If the gait is stable, as in walking at a constant speed, the associated frequency would be stable, with small variations due to the inherent variability of the system. By measuring, for example, the vertical displacement of the centre of each joint while walking or running, we would obtain a series of one-dimensional oscillating waveforms. The collection of these waveforms provides a representation of the co-ordinated movement of the body. Studying their properties, such as phase relations, frequencies and amplitudes, then provides a way to investigate the order parameters that define modes of co-ordination.

Previous methods of examining the relation between components of the body have included statistical techniques such as principal-component analysis, or analysis of coupled oscillators through vector coding or continuous relative phase. However, statistical techniques discard data, ignoring the small variations that arise from the inherent variability of the system, while coupled-oscillator analysis can handle only two components contributing to the co-ordination.

Sonograms to study body movements

Our approach is based on the idea of analysing the waveforms and their relations by translating them into audible signals and using the natural capability of the ear to distinguish, characterise and analyse waveform shapes, amplitudes and relations. This process is called data sonification, and one of the main tools for investigating the structure of the sound is the sonogram (sometimes also called a spectrogram). A sonogram is a visual representation of how the spectrum of a sound signal changes with time, and we can use sonograms to examine the phase relations between a large collection of variables without having to reduce the data. Spectral analysis is a particularly relevant tool in many scientific disciplines, for example in high-energy physics, where the interest lies in energy spectra, pattern and anomaly detection, and phase transitions.
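A sonogram of this kind can be sketched in a few lines of Python. The gait-like test signal, sampling rate and window parameters below are illustrative assumptions, not the study’s actual analysis settings:

```python
import numpy as np
from scipy.signal import spectrogram

# Hypothetical gait-like signal: an oscillation whose frequency drifts
# from 1 to 2 Hz over 120 s, mimicking a slowly accelerating stride,
# sampled at 100 Hz (12 000 samples).
fs = 100.0
t = np.arange(0, 120, 1 / fs)
freq = 1.0 + t / 120.0                       # instantaneous frequency (Hz)
signal = np.sin(2 * np.pi * np.cumsum(freq) / fs)

# Sonogram: short-time spectrum using a moving Hann analysis window.
f, times, Sxx = spectrogram(signal, fs=fs, window="hann",
                            nperseg=512, noverlap=384)

# Each column of Sxx is the power spectrum of one window position,
# so the ridge of maximum power traces the drifting stride frequency.
print(Sxx.shape)  # (frequency bins, time slices)
```

Reading off the peak frequency in the first and last columns shows the spectral ridge rising with the treadmill speed, which is exactly the structure the sonogram makes visible.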

Using a sonogram to examine the movement of multiple markers on the body in the frequency domain, we can obtain an individual and situation-specific representation of co-ordination between the major limbs. Because anti-phase frequencies cancel, in-phase frequencies enhance each other, and a certain degree of variability in the phase of the oscillation results in a band of frequencies, we are able to represent the co-ordination within the system through the resulting spectrogram.
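The cancellation and enhancement at work here can be demonstrated with two idealised marker waveforms. These are pure sinusoids at an assumed stride frequency, not real kinematic data:

```python
import numpy as np

fs = 1000
t = np.arange(0, 2, 1 / fs)

# Two hypothetical marker waveforms sharing the same stride frequency.
knee = np.sin(2 * np.pi * 1.5 * t)                # e.g. left knee
elbow_anti = np.sin(2 * np.pi * 1.5 * t + np.pi)  # opposing limb, anti-phase
elbow_in = np.sin(2 * np.pi * 1.5 * t)            # same limb side, in-phase

# Additive synthesis: summing the signals before spectral analysis.
cancelled = knee + elbow_anti   # anti-phase components cancel
enhanced = knee + elbow_in      # in-phase components double in amplitude

print(np.max(np.abs(cancelled)))  # ~0
print(np.max(np.abs(enhanced)))   # ~2
```

With phase jitter added to one of the signals, the cancellation becomes partial and the single spectral line broadens into a band, which is the behaviour described above.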

In our study, we see exactly this. A participant ran on a treadmill that accelerated from 0 to 18 km/h over two minutes. A motion-analysis system was used to collect 3D kinematic data from 24 markers placed bilaterally on the head, neck, shoulders, elbows, wrists, hands, pelvis, hips, knees, heels, ankles and toes of the participant (sampling frequency 100 Hz, trial length 120 s). Individual and combined marker measurements were resampled to generate audible waveforms. Sonograms were then computed using moving Hanning analysis windows for all of the sound signals obtained for each marker and combination of markers.
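The audification step can be sketched as follows. Reinterpreting the 100 Hz kinematic samples at an audio playback rate scales every frequency in the signal up into the audible band; the marker trace and playback rate below are illustrative assumptions:

```python
import numpy as np

# Hypothetical vertical-displacement trace of one marker: 120 s at 100 Hz,
# oscillating at an assumed stride frequency of 1.5 Hz.
fs_motion = 100                      # kinematic sampling rate (Hz)
t = np.arange(0, 120, 1 / fs_motion)
marker = np.sin(2 * np.pi * 1.5 * t)

# Audification: play the same samples back at an audio rate. The waveform
# is unchanged; only the playback clock differs, so every frequency in the
# signal is multiplied by fs_audio / fs_motion.
fs_audio = 44100
speedup = fs_audio / fs_motion            # factor of 441
audible_freq = 1.5 * speedup              # 1.5 Hz -> 661.5 Hz, clearly audible
audio_duration = len(marker) / fs_audio   # 120 s compresses to ~0.27 s

print(audible_freq, round(audio_duration, 3))
```

In practice the trial would be slowed or looped so the listener has time to follow it, but the frequency scaling is the essential move: sub-hertz body rhythms become pitches the ear can compare.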

Sonification of individual and combined markers is shown above right. Sonification of an individual marker placed on the left knee (top left in the figure) shows the frequencies underpinning the marker movement on that particular joint-centre. By combining the markers, say of a whole limb such as the leg, we can examine the relations of single markers, through the cancellation and enhancement of frequencies involved. The result will show some spectral lines strengthening, others disappearing and others stretching to become bands (top right). The nature of the collective movements and oscillations that underpin the mechanics of an arm or a leg moving regularly during the gait can then be analysed through the sound generated by the superposition of the relative waveforms.

A particularly interesting case appears when we combine audifications of marker signals coming from opposing limbs, for example left leg/right arm or right leg/left arm. The sonogram at bottom left in the figure represents the frequency content of the oscillations from the combined sensors on the left leg and the right arm (a combination known in audio engineering as additive synthesis). If we compare the sonogram of the left leg alone (top right) with that of the combination with the opposing arm, we can see that some spectral lines disappear from the spectrum, because of the phase opposition between some of the markers, for example the left knee and the right elbow, or the left ankle and the right hand.

The final result of this cancellation is a globally simpler dynamical system, described by a smaller number of frequencies. The frequencies themselves, their sharpness (variability) and the point of transition provide key information about the system. In addition, we are able to observe and hear the phase transition between the walking and running state, indicating that our technique is suitable for examining these order-parameter states. By examining movement in the frequency domain, we obtain an individual and situation-specific representation of co-ordination between the major limbs.

Sonification of movement as audio feedback

Sonification, as in the example above, does not require data reduction. It can provide us with unique ways of quantifying and perceiving co-ordination in human movement, contributing to our knowledge and understanding of motor control as a self-organised dynamical system that moves through stable and unstable states in response to changes in organismic, task and environmental constraints. For example, the specific measurement described above is a tool to increase our understanding of the adaptability of human motor control to something like a prosthetic limb. The application of this technique will aid diagnosis and tracking of pathological and perturbed gait, for example highlighting key changes in gait with ageing or leg surgeries.

In addition, we can also use sonification of movements as a novel form of audio feedback. Movement is key to healthy ageing and recovery from injuries or even pathologies. Physiotherapists and practitioners prescribe exercises that take the human body through certain movements, creating certain forces. The precise execution of these exercises is fundamental to the expected benefits, and while this is possible under the watchful eye of the physiotherapist, it can be difficult to achieve when alone at home.

Executing exercises precisely presents three main challenges. First, there is the patient’s memory of what the correct movement or exercise should look like. Second, there is the patient’s ability to execute the required movement correctly, working the right muscles to move the joints and limbs through the correct space, over the right amount of time or with an appropriate amount of force. Last, there is the motivation needed to perform painful, strenuous or boring exercises, sometimes many times a day.

Sonification can provide not only real-time audio feedback but also elements of feed-forward, which provides a quantitative reference for the correct execution of movements. This means that the patient has access to a map of the correct movements through real-time feedback, enabling them to perform correctly. And let’s not forget about motivation. Through sonification, in response to the movements, the patient can generate not only waveforms but also melodies and sounds that are pleasing.

Another possible application of generating melodies associated with movement is in the artistic domain. Accelerometers, vibration sensors and gyroscopes can turn gestures into melodic lines and harmonies. The demo organised during the public talk of the International Conference on Translational Research in Radio-Oncology – Physics for Health in Europe (ICTR-PHE), on 16 February in Geneva, was based on that principle. Using accelerometers connected to the arm of a flute player, we could generate melodies related to the movements naturally occurring when playing, in a sort of duet between the flute and the flutist. Art and science and music and movement seem to be linked in a natural but profound way by a multitude of different threads, and technology keeps providing the right tools to continue the investigation just as Kepler did four centuries ago.

Charting the future of CERN

Over the next five years, key events shaping the future of particle physics will unfold. We will have results from the second run of the LHC, and from other particle and astroparticle physics projects around the world. These will help us to chart the future scientific road map for our field. The international collaboration that is forming around the US neutrino programme will crystallise, bringing a new dimension to global collaboration in particle physics. And initiatives to host major high-energy colliders in Asia should become clear. All of this will play a role in shaping the next round of the European Strategy for Particle Physics, which will in turn shape the future of our field in Europe and at CERN.

CERN is first and foremost an accelerator laboratory. It is there that we have our greatest experience and concentration of expertise, and it is there that we have known our greatest successes. I believe that it is also there that CERN’s future lies. Whether or not new physics emerges at the LHC, and whether or not a new collider is built in Asia, CERN should aim to maintain its pre-eminence as an accelerator lab exploring fundamental physics.

CERN’s top priority for the next five years is ensuring a successful LHC Run 2, and securing the financial and technical development and readiness of the High-Luminosity LHC project. This does not mean that CERN should compromise its scientific diversity. Quite the opposite: our diversity underpins our strength. CERN’s programme today is vibrant, with unique facilities such as the Antiproton Decelerator and ISOLDE, and experiments studying topics ranging from kaons to axions. This is vital to our intellectual life, and it is a programme that will evolve and develop as physics needs dictate. Furthermore, with the new neutrino platform, CERN is contributing to projects hosted outside of Europe, notably the exciting neutrino programme underway at Fermilab.

If CERN is to retain its position as a focal point for accelerator-based physics in the decades to come, we must continue to play a leading role in global efforts to develop technologies to serve a range of possible physics scenarios. These include R&D on superconducting high-field magnets, high-gradient, high-efficiency accelerating structures, and novel acceleration technologies. In this context, AWAKE is a unique project using CERN’s high-energy, high-intensity proton beams to investigate the potential of proton-driven plasma wakefield acceleration for the very-long-term future. In parallel, CERN is playing a leading role in international design studies for future high-energy colliders that could succeed the LHC in the medium-to-long term. Circular options, with colliding electron–positron and proton–proton beams, are covered by the Future Circular Collider (FCC) study, while the Compact Linear Collider (CLIC) study offers potential technology for a linear electron–positron option reaching the multi-TeV range. To ensure a future programme that is compelling, and scientifically diverse, we are putting in place a study group that will investigate future opportunities other than high-energy colliders, making full use of the unique capabilities of CERN’s rich accelerator complex, while being complementary to other endeavours around the world. Along with the developments I mention above, these studies will also provide valuable input into the next update of the European Strategy, towards the end of this decade.

Global planning in particle physics has advanced greatly over recent years, with European, US and Japanese strategies broadly aligning, and the processes that drive them becoming ever more closely linked. For particle physics to secure its long-term future, we need to continue to promote strong worldwide collaborations, develop synergies, and bring new and emerging players, for example in Asia, into the fold.

Within that broad picture, CERN should steer a course towards a future based on accelerators. Any future accelerator facility will be an ambitious undertaking, but that should not deter us. We should not abandon our exploratory spirit just because the technical and financial challenges are intimidating. Instead, we should rise to the challenge, and develop the innovative technologies needed to make our projects technically and financially feasible.

Classical Dynamics: A Modern Perspective (2nd edition)

By E C G Sudarshan and N Mukunda
World Scientific


More than 40 years after the appearance of the first edition, this book is now published in a revised version presented with the same passion and dedication as the original. The authors confess that they have always had an “affair of the heart” with classical dynamics, and this remains alive.

In the volume, classical dynamics is treated as a subject in its own right, as well as a research frontier. While presenting all of the essential principles, the authors demonstrate that a number of key results originally considered only in the context of quantum theory and particle physics have their foundations in classical dynamics.

While the text is grounded in what the authors define as “our understanding of quantum mechanics”, this new version also builds on many suggestions from other physicists and on continuous dialogue with students who use the book as a reference.

Key Nuclear Reaction Experiments: Discoveries and Consequences

By Hans Paetz gen. Schieck
IOP Publishing


Nuclear physics has seen enormous developments in the last century. The study of nuclear reactions has given fundamental insights into the nature of the forces that act within nuclei and on the structure of nuclides.

This book traces the history of the development of nuclear physics by reviewing key experiments that have shaped our understanding of the field. It is interesting to look back to the beginning and discover how crucial results were obtained by very simple means, and how the sophisticated and complex experiments of today came about. Experiments are described in detail and their outcomes are discussed. In some cases, original drawings are included, accompanied by new figures and plots when needed.

The theoretical background to the experiments is also given, but is kept concise. Nevertheless, the reader can refer to the references at the end of each chapter for a more in-depth treatment of individual subjects.

Besides drawing on the history of experiments and related discoveries, the book shows how misinterpretations and prejudices in some cases prevented or delayed fundamental breakthroughs.

Lectures on Quantum Mechanics (2nd edition)

By Steven Weinberg
Cambridge University Press
Also available at the CERN bookshop


After the great success of the 1st edition, the textbook on modern quantum mechanics by Nobel laureate Steven Weinberg is presented in a fully updated 2nd edition.

Thanks to his profound knowledge and expertise, the author explains in an exceptionally clear and rigorous way the topics of the subject that he considers to be the most important for a one-year graduate course. He begins with a historical review of quantum mechanics and an account of classic solutions to the Schrödinger equation, then goes on to develop quantum mechanics in a modern Hilbert-space approach.

Weinberg gives much greater emphasis than usual to principles of symmetry, and covers subjects that are often omitted in books on quantum mechanics, such as Bloch waves, time-reversal invariance, the Wigner–Eckart theorem, isotopic spin symmetry, Levinson’s theorem, etc. This 2nd edition includes major additions to existing chapters and has also been enriched by the addition of six new sections, covering topics such as the rigid rotor and quantum-key distribution.

The author takes care to explain the formalism, to be clear and coherent, and includes numerous examples from elementary particle physics. Problems are also included at the end of each chapter. Well-structured and easily readable, this book is bound to receive the same approval as the 1st edition did.

The Standard Model of Quantum Physics in Clifford Algebra

By C Daviau and J Bertrand
World Scientific


In this book, the authors discuss the Standard Model, drawing upon the Clifford algebra (a special case of geometric algebra) of space–time, following the work of Hestenes and other physicists who, in the 1980s, revisited Dirac theory using the Clifford formalism.

After an introduction on the basics of Clifford algebra and the Dirac equation, the authors move on to place Dirac theory in a 3D framework, based on the Clifford algebra Cl3 of 3D space. They introduce their homogeneous nonlinear equation and explain why, in their opinion, it is better than the Dirac equation, which is its linear approximation.

Several consequences deriving from these novelties are then discussed extensively. In particular, a first attempt to reconcile the quantum world with inertia and gravitation is made.

The book also includes three appendices, reporting demonstrations and calculations related to the concepts explained in the text, and a rich bibliography.

Introduction to the AdS/CFT Correspondence

By Horaţiu Năstase
Cambridge University Press


The aim of this book is to give a pedagogical introduction to Anti-de Sitter/Conformal Field Theory, or AdS/CFT, which is the relation between quantum field theory with conformal invariance, living in our flat 4D space, and string theory, which is a quantum theory of gravity and other fields, living in the background solution of AdS5 × S5 (5D anti-de Sitter space multiplied by a five-sphere).

Assuming knowledge of only the basics of quantum field theory, the text provides readers with all of the concepts and tools needed to engage with AdS/CFT. In the first part, the author describes some fundamental concepts of general relativity, supersymmetry, supergravity, string theory, conformal field theory and D-branes. He has chosen not to overload the text with too many details about these fields, to keep the reader focused. The second part provides a clear and rigorous treatment of the AdS/CFT correspondence (in the context of its best-understood example). Finally, in the third part, more specialised applications are discussed, such as QCD, quark–gluon plasma and condensed matter.

The book is self-contained, introducing all of the necessary basic concepts and most of the AdS/CFT methods and tools, but for an in-depth or exhaustive treatment, the reader is advised to refer to research articles. The many examples and exercises at the end of each chapter reveal the pedagogical vocation of the volume, nevertheless it will also be a useful reference for researchers in the fields of particle, nuclear and condensed-matter physics.

Silver Nanoparticles: From Silver Halide Photography to Plasmonics

By Tadaaki Tani
Oxford University Press


This book gives a comprehensive review of the synthesis, optical properties and applications of silver nanoparticles and nanomaterials.

Today, nanoscience, which is the study of extremely small things, plays a fundamental role in technology, and is connected to many scientific fields: physics, chemistry, biology, materials science and engineering. Nanoparticles are of great scientific interest because they provide a bridge between bulk materials and atomic or molecular structures. We know that a bulk material normally has constant physical properties regardless of its size, whereas size-dependent properties are often observed at the nanoscale.

Researchers are interested in nanoparticles of noble metals, including silver (Ag), because they show high potential for possible future plasmonic devices. On the other hand, nanoparticles of silver and silver halides (AgX) have been extensively studied in silver-halide photography.

The author offers an overview of both the properties of silver nanoparticles and related materials, and know-how in AgX photography. The first part (chapters 1–3) introduces the structure and preparation of nanoparticles of Ag and other noble metals for plasmonics, as well as those of Ag and AgX nanoparticles in photography. Then, in chapter 4, the relevant properties and performance of nanoparticles of Ag and related materials are presented, focusing in particular on light absorption and scattering. In the third part (chapters 5–7), the author discusses the applications of this research in catalysis, photovoltaic effects and plasmonics. New ideas in the field are also presented at the end.

Full of pictures and references, this text represents a synthesis of up-to-date knowledge in the field.
