
Holograms: A Cultural History

By Sean F Johnston
Oxford University Press


The book is a sort of biography of holograms, peculiar optical “objects” that have crossed the border of science to enter other cultural and anthropological fields, such as art, visual technology, pop culture, magic and illusion. No other visual experience is like interacting with holograms – they have the power to fascinate and amuse people. Not only physicists and engineers, but also artists, hippies, hobbyists and illusionists have played with, and dreamed about, holograms.

This volume can be considered as complementary to a previous book by the same author, a professor at the University of Glasgow, called Holographic Visions: A History of New Science. While the first book gave an account of the scientific concepts behind holography, and of its development as a research subject and engineering tool, the present text focuses on the impact that holography has had on society and consumers of such technology.

The author explores how holograms found a place in distinct cultural settings, moving from being expressions of modernity to countercultural art means, from encoding tools for security to vehicles for mystery.

Clearly written and full of interesting factual information, this book is addressed to historians and sociologists of modern science and technology, as well as to enthusiasts who are interested in understanding the journey of this fascinating optical medium.

Theoretical Foundations of Synchrotron and Storage Ring RF Systems

By Harald Klingbeil, Ulrich Laier, and Dieter Lens
Springer
Also available at the CERN bookshop


This book is one of the few, if not the only one, dedicated to radiofrequency (RF) accelerator systems and longitudinal dynamics in synchrotrons, providing a self-contained and clear theoretical introduction to the subject. Some of these topics can be found in separate articles from specialised schools, but not in such a comprehensive form in a single source. The content of the book is based on a university course and is addressed to graduate students who want to study accelerator physics and engineering.

After a short introduction on accelerators, the second chapter provides a concise but complete overview of the mathematical-physics tools that are required in the following chapters, such as Fourier analysis and the Laplace transform. Ordinary differential equations and the basics of non-linear dynamics are presented with the notions of phase space, phase flow and velocity vector fields, leading naturally to the continuity equation and to the Liouville theorem. Hamiltonian systems are elegantly introduced, and the mathematical pendulum as well as an LC circuit are used as examples. This second chapter provides the necessary background for any engineer or physicist wishing to enter the field of accelerator physics. The basic formulas and concepts of electromagnetism and special relativity are briefly recalled. The text is completed by a useful set of tables and diagrams in the appendix. An extensive set of references is given, although a non-negligible number are in German and might not be of help to the English-speaking reader. The same is true of other chapters.
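
To give a flavour of the material, the mathematical pendulum used as an example in this chapter is governed by the standard Hamiltonian (written here in common textbook notation, which may differ from the book's own):

```latex
H(\varphi, p) = \frac{p^2}{2ml^2} - mgl\cos\varphi ,
\qquad
\dot{\varphi} = \frac{\partial H}{\partial p}, \quad
\dot{p} = -\frac{\partial H}{\partial \varphi} .
```

Small oscillations reduce to harmonic motion, while the separatrix dividing oscillation from rotation has the same phase-space topology as the RF bucket encountered later in the book.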

In the third chapter, the longitudinal dynamics in synchrotrons is detailed. The basic equations and formulas describing synchrotron motion, bunch and bucket parameters are derived step-by-step, confirming the book's pedagogical intent. The examples of a ramp and of multicavity operation are sketched out. I would have further developed the evolution of the RF parameters in a ramp, using one of the GSI accelerators as a more concrete numerical example.
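
In one common convention (the book's own notation may differ), the equations of synchrotron motion derived here take the form

```latex
\frac{\mathrm{d}(\Delta\phi)}{\mathrm{d}t}
  = \frac{h\,\eta\,\omega_{\mathrm{rev}}}{\beta^2 E_s}\,\Delta E ,
\qquad
\frac{\mathrm{d}(\Delta E)}{\mathrm{d}t}
  = \frac{qV\omega_{\mathrm{rev}}}{2\pi}\left(\sin\phi - \sin\phi_s\right) ,
```

which, for small deviations from the synchronous phase φs, combine into harmonic oscillations at the synchrotron frequency ωs² = −h η qV cos φs ω²rev/(2πβ²Es), stable when η cos φs < 0.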

In the fourth chapter, the two most common types of RF cavities (ferrite-loaded and pillbox) are discussed in detail (in particular, the ferrite-loaded ones used in low- and medium-energy accelerators), providing detailed derivations of the various parameters and completing them with two examples referring to two specific applications.

The fifth chapter contains an interesting and thorough discussion on the theoretical description of beam manipulation in synchrotrons, with particular emphasis on the notion of adiabaticity, which is critical for emittance preservation in operation with high-brightness beams. This concept is normally dealt with in a qualitative way, while in this book a more solid background, derived from classical Hamiltonian mechanics, is provided. In the second part of the chapter, after an introduction to the description of a bunch by means of moments, including the concept of RMS emittance, a description of longitudinal bunch oscillations and their spectral representation is given, providing the basis for the study of longitudinal beam stability. Beam stability itself is not addressed in the book; the notion of impedance is briefly introduced for the case of space charge, and some references covering these subjects are provided.
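
The statistical (RMS) emittance mentioned here is defined through second-order moments of the bunch distribution. A minimal numerical sketch (the variable names and numbers are purely illustrative, not taken from the book):

```python
import numpy as np

def rms_emittance(t, E):
    """Statistical (RMS) longitudinal emittance from centred second moments."""
    t = t - t.mean()
    E = E - E.mean()
    return np.sqrt(t.var() * E.var() - (t * E).mean() ** 2)

# Toy uncorrelated bunch: 1 ns time spread, 1 MeV energy spread (assumed values).
rng = np.random.default_rng(1)
sigma_t, sigma_E = 1e-9, 1e6
t = rng.normal(0.0, sigma_t, 100_000)
E = rng.normal(0.0, sigma_E, 100_000)
eps = rms_emittance(t, E)   # ≈ sigma_t * sigma_E for an uncorrelated bunch
```

For a correlated (e.g. chirped) bunch the cross term reduces the emittance below the product of the projected spreads, which is exactly what makes the moment-based definition useful.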

The last two chapters are devoted to the engineering aspects of RF accelerator systems: power amplifiers and closed-loop controls. The chapter on power amplifiers is mainly focused on the solutions of interest for low- and medium-energy synchrotrons, whereas high-frequency narrowband power amplifiers like klystrons are very briefly discussed. The chapter on low-level RF is rather dense but still clearly written, and is built around a specific example of an amplitude control loop. That eases the understanding of concepts and criteria underlying feedback stability and the impact of time delays and disturbances. The necessary mathematical tools are presented with a due level of detail, before delving into the stability criteria and into a discussion of the chosen example.

The volume is completed by a rich appendix summarising basic concepts and formulas required elsewhere in the book (e.g. some notions of transverse beam dynamics and the characterisation of fixed points) or working out in detail some examples of subjects treated in the main text. Some handy refreshers on calculus and algebra are also provided.

This book undoubtedly fills a gap in the panorama of textbooks dedicated to accelerator physics. I would recommend it to any physicist or engineer entering the field. I enjoyed reading it as a comprehensive and clear introduction to some aspects of accelerator RF engineering, as well as to some of the theoretical foundations of accelerator physics and, in general, of classical mechanics.

50 Years of Quarks

By Harald Fritzsch and Murray Gell-Mann (eds)
World Scientific
Also available at the CERN bookshop


This book was written on the occasion of the golden anniversary of a truly remarkable year in fundamental particle physics: 1964 saw the discovery of CP violation in the decays of neutral kaons, of the Ω baryon (at the Brookhaven National Laboratory), and of cosmic microwave background radiation. It marked the invention of the Brout–Englert–Higgs mechanism, and the introduction of a theory of quarks as fundamental constituents of strongly interacting particles.

Harald Fritzsch and Murray Gell-Mann, the two fathers of quantum chromodynamics, look back at the events that led to the discovery, and eventual acceptance, of quarks as constituent particles. Why should we look back at the 1960s? Besides the fact that it is always worthwhile to reminisce about those times when theoretical physicists were truly eclectic, these stories are the testimony of a very active era, in which theoretical and experimental discoveries rapidly chased one another. What is truly remarkable is that, even in the absence of an underlying theory, by piecing together sets of disparate experimental hints, the high-energy physics community was always able to provide a consistent description of the observed particles and their interactions. In fact, it was general principles such as causality, unitarity and Lorentz invariance that allowed far-reaching insights into analyticity, dispersion relations, the CPT theorem and the relation between spin and statistics to be obtained.

In this volume, Fritzsch and Gell-Mann present a collection of contributions written by renowned physicists (including S J Brodsky, J Ellis, H Fritzsch, S L Glashow, M Kobayashi, L B Okun, S L Wu, G Zweig and many others) that led to crucial developments in particle theory. The individual contributions in the book range from technical manuscripts, lecture notes and articles written 50 years ago, to personal, anecdotal and autobiographical accounts of endeavours in particle physics, emphasising how they interwove with the conception and eventual acceptance of the quark hypothesis. The book conveys the enthusiasm and motivation of the scientists involved in this journey, their triumph in cases of success, their amazement in cases of surprises or difficulties, and their disappointment in cases of failure. One realises that while quantum chromodynamics seems a simple and natural theory today, not everything was as easy as it now looks, 50 years later. In fact, the paradoxical properties of quarks, imprisoned for life in hadrons, had no precedent in the history of physics.

The last 50 years have witnessed spectacular progress in the description of the elementary constituents of matter and their fundamental interactions, with important discoveries that led to the establishment of the Standard Model of particle physics. This theory accurately describes all observable matter, namely quarks and leptons, and their interactions at colliders through the electromagnetic, weak and strong forces. Yet many open questions remain that are beyond the reach of our current understanding of the laws of physics. Of central importance now is the understanding of the composition of our universe, dark matter and dark energy, the hierarchy of masses and forces, and a consistent quantum framework for the unification of all forces of nature, including gravity. The closing contributions of the book put this venture in the context of today's high-energy physics programme, and make a connection to the most popular ideas in high-energy physics today, including supersymmetry, unification and string theory.

Open access ebooks

Open access (OA) publishing is proving to be a very successful publishing model in the scholarly scientific field: today, more than 10,000 journals are accessible in OA, according to the Directory of Open Access Journals. Building on this positive experience, ebooks are also becoming available under this free-access scheme.

The economic model is largely inspired by the well-established practice in scientific article publishing, and several publishers have expanded their catalogues to include OA books. Under an appropriate licensing scheme, the authors retain copyright, while the content can be freely shared and reused with due credit to the authors.

In addition to expanding the diffusion of knowledge, OA ebooks avoid the production and distribution costs of paper books, which result in high prices for titles often acquired only by libraries. OA ebooks are also the ideal outlet for the publication of conference proceedings, maximising their visibility, with great benefits for library budgets.

Five key works written or edited by CERN authors are already profiting from the impact that comes from their free dissemination.

Three of them are already accessible online:

• Melting Hadrons, Boiling Quarks – From Hagedorn Temperature to Ultra-Relativistic Heavy-Ion Collisions at CERN: With a Tribute to Rolf Hagedorn by Johann Rafelski (ed), published by Springer (link.springer.com/book/10.1007%2F978-3-319-17545-4).

• 60 Years of CERN Experiments and Discoveries by Herwig Schopper and Luigi Di Lella (eds), published by World Scientific (dx.doi.org/10.1142/9441#t=toc).

• The High Luminosity Large Hadron Collider: The New Machine for Illuminating the Mysteries of Universe by Lucio Rossi and Oliver Brüning (eds), published by World Scientific (dx.doi.org/10.1142/9581#t=toc).

Two further OA titles will appear in 2016:

• The Standard Theory of Particle Physics: 60 Years of CERN by Luciano Maiani and Luigi Rolandi (eds), published by World Scientific.

• Technology Meets Research: 60 Years of Technological Achievements at CERN, Illustrated with Selected Highlights by Chris Fabjan, Thomas Taylor and Horst Wenninger (eds), published by World Scientific.

Members of the organising committee of a conference, looking for an OA outlet for the proceedings, and authors who are planning to publish a book, are invited to contact the CERN Library, so that the staff there can help them to negotiate conditions with potential publishers.

The LHC is restarting

The LHC, last among all of CERN's accelerators, is resuming operation with beam as this issue goes to press. The year-end technical stop (YETS) started on 14 December 2015. During the 11 weeks of scheduled maintenance activities, several interventions took place across all of the accelerators and beamlines. They included maintenance of the cryogenic system at several points; the replacement of 18 magnets in the Super Proton Synchrotron (SPS); an extensive campaign to identify and remove thousands of obsolete cables; the replacement of the LHC injection beam absorbers (TDIs), which absorb the SPS beam if a problem occurs and so provide vital protection for the LHC; and the dismantling and reinstallation of 12 LHC collimators, after modification of the vacuum chambers that had restricted their movement.

The YETS also gave the experiments the opportunity to carry out repairs and maintenance work on their detectors. In particular, this included fixing the ATLAS vacuum-chamber bellows and cleaning the cold box at CMS, which had caused problems for the experiment's magnet during 2015.

Bringing beams back into the machine after a technical stop of a few weeks is no trivial matter. The Electrical Quality Assurance (ELQA) team needs to test the electrical circuits of the superconducting magnets, certifying their readiness for operation. After that, the powering tests can start: about 7000 tests in 12 days, a critical task for all of the teams involved that relies on the availability of all of the sectors. About four weeks after the start of commissioning, the LHC is ready to receive its first beams and to let them circulate for several hours in the machine ("stable beams").

The goal of this second part of Run 2 is to reach 2700 bunches per beam at 6.5 TeV with the nominal 25 ns spacing. In 2015, the machine reached a record of 2244 bunches in each beam, just before the beginning of the YETS. In 2016, the focus of the operators will be on ensuring maximum availability of the machine. To this end, beam-pipe scrubbing will be performed several times to keep electron-cloud effects under control. Thanks to the experience acquired in 2015, the operators will be able to improve the injection process and to perform ramping and squeezing at the same time, thereby reducing the turnaround time between successive fills.

In addition to several weeks of steady standard 13 TeV operation with 2700 bunches per beam and β* = 40 cm, the accelerator schedule for 2016 includes a high-β* (~ 2.5 km) running period for TOTEM/ALFA, dedicated to the measurement of elastic proton–proton scattering in the Coulomb–nuclear interference region. The schedule also includes one month of heavy-ion running. Although various configurations (Pb–Pb and p–Pb) are still under consideration, the period – November – has already been decided. As usual, the heavy-ion run will conclude the 2016 operation of the LHC, while the extended year-end technical stop (EYETS) will start in December and last about five months, until April 2017. The experiments have already planned several upgrades for the EYETS, including installation of the new pixel system at CMS.

The goal for the second part of Run 2 is to reach a luminosity of 1.3 × 10³⁴ cm⁻² s⁻¹, which with about 2700 bunches and 25 ns spacing is estimated to produce a pile-up of 40 events per bunch crossing. This should give an integrated luminosity of about 25 fb⁻¹ in 2016, which should ensure a total of 100 fb⁻¹ for Run 2 – planned to end in 2018.
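
The quoted pile-up figure can be cross-checked with a back-of-the-envelope estimate. The inelastic cross-section used below (~80 mb at 13 TeV) and the revolution frequency are assumptions of this sketch, not figures from the article:

```python
# Rough pile-up estimate: mu = L * sigma_inel / (bunch-crossing rate)
L = 1.3e34            # cm^-2 s^-1, target luminosity
sigma_inel = 80e-27   # cm^2 (~80 mb, assumed inelastic cross-section at 13 TeV)
n_bunches = 2700
f_rev = 11245.0       # Hz, LHC revolution frequency
mu = L * sigma_inel / (n_bunches * f_rev)   # ≈ 34 interactions per crossing
```

The result is a few tens of events per crossing, of the same order as the quoted 40; the exact number depends on the cross-section value and the filling scheme.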

Searches with boosted topologies at Run 2

The first LHC run was highlighted by the discovery of the long-awaited Higgs boson at a mass of about 125 GeV, but we have no clue why nature chose this mass. Supersymmetry explains this by postulating a partner particle to each of the Standard Model (SM) fermions and bosons, but these new particles have not yet been found. A complementary approach to address this issue is to widen the net and look for signatures beyond those expected from the SM.

Searches for new physics in Run 1 found no signals, and from these negative results we know that any new particles may be heavy. For this reason, their decay products, such as top quarks, electroweak gauge bosons (W, Z) or Higgs bosons, may be very energetic and highly boosted. When such particles are produced with large momentum and decay into quark final states, the decay products often collimate into a small region of the detector. The collimated sprays of hadrons (jets) originating from the nearby quarks can therefore no longer be reliably distinguished. Special techniques have been developed to reconstruct such boosted particles as single jets with a wide opening angle, and to identify the cores associated with the quarks using soft-particle-removal procedures ("grooming"). Before the second LHC run, ATLAS performed an extensive optimisation of top, W, Z and Higgs-boson identification, exploiting a wide range of jet-clustering and grooming algorithms as well as kinematic properties of the jet substructure. This led to a factor-of-two improvement in background rejection for W/Z tagging at the same signal efficiency, for W/Z-boson transverse momenta of around 300–500 GeV.
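
The collimation effect described above follows from simple kinematics: the typical opening angle between the two decay quarks scales as ΔR ≈ 2m/pT. A quick illustration (this formula is a standard rule of thumb, not an ATLAS prescription):

```python
def delta_r(mass_gev, pt_gev):
    """Typical opening angle between the two quarks from a boosted two-body decay."""
    return 2.0 * mass_gev / pt_gev

# A W boson (m ~ 80.4 GeV) at pT = 400 GeV: the decay products fall within
# Delta R ~ 0.4, comparable to a standard jet radius, so the two quark jets merge.
dr = delta_r(80.4, 400.0)   # ≈ 0.40
```

At pT = 300–500 GeV the opening angle drops from ~0.5 to ~0.3, which is exactly the regime where dedicated wide-jet plus grooming techniques become necessary.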

ATLAS exploited the optimised boson tagging in searches for heavy resonances decaying into a pair of electroweak gauge bosons (WW, WZ, ZZ) or into a gauge boson and a Higgs boson (WH, ZH) in 13 TeV collisions. Events are categorised according to the number of charged/neutral leptons, and all possible combinations are considered except for fully leptonic and fully hadronic WH or ZH decays. For the Higgs boson, only the dominant decay into b quarks is considered. Figure 1 shows the results of the WZ searches with the 2015 data set, corresponding to 3.2 fb⁻¹, presented as upper limits on the production cross-section times branching fraction for a new massive gauge boson as a function of its mass. No evidence for new physics has been found in these preliminary searches.

The boosted techniques have evolved into a fundamental tool for beyond-the-SM searches at high energy. ATLAS expects its searches to be greatly enhanced by these techniques, and looks forward to applying them in uncharted territory during the upcoming LHC run.

Anisotropic flow in Run 2

Exploiting the data collected during November 2015 with Pb–Pb collisions at the record-breaking energy of √s_NN = 5.02 TeV, ALICE has measured for the first time the anisotropic flow of charged particles at this energy.

Relativistic heavy-ion collisions are the tool of choice to investigate the quark–gluon plasma (QGP) – a state of matter in which quarks and gluons move freely over distances that are large in comparison to the typical size of a hadron. Anisotropic flow, which measures the momentum anisotropy of the final-state particles, is sensitive on the one hand to the initial density and to the initial geometry fluctuations of the overlap region, and on the other hand to the transport properties of the QGP. Flow is quantified by the Fourier coefficients, vn, of the azimuthal distribution of the final-state charged particles. The dominant flow coefficient, v2, referred to as elliptic flow, is related to the initial geometric anisotropy. Higher coefficients, such as triangular flow (v3) and quadrangular flow (v4), can be related primarily to the response of the produced QGP to fluctuations of the initial energy-density profile of the participating nucleons.
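
As a toy illustration of how a flow coefficient is extracted, one can sample azimuthal angles from dN/dφ ∝ 1 + 2v2·cos(2φ) and recover v2 as ⟨cos 2φ⟩. This sketch assumes, unrealistically, a known reaction plane at Ψ = 0; real analyses use multi-particle correlation techniques instead:

```python
import numpy as np

# Sample angles from the flow-modulated distribution by accept/reject,
# then estimate v2 as the mean of cos(2*phi) (reaction plane fixed at Psi = 0).
rng = np.random.default_rng(0)
v2_true = 0.10
phi = np.empty(0)
while phi.size < 200_000:
    cand = rng.uniform(-np.pi, np.pi, 50_000)
    keep = (rng.uniform(0.0, 1.0 + 2.0 * v2_true, cand.size)
            < 1.0 + 2.0 * v2_true * np.cos(2.0 * cand))
    phi = np.concatenate([phi, cand[keep]])
phi = phi[:200_000]
v2_meas = np.cos(2.0 * phi).mean()   # close to v2_true
```

The same recipe with cos(3φ) and cos(4φ) yields the triangular and quadrangular coefficients discussed in the article.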

Figure 1 shows the centrality dependence of the flow coefficients, both for 2.76 and 5.02 TeV Pb–Pb collisions. Compared with the lower-energy results, the anisotropic flow coefficients v2, v3 and v4 increase at the newly measured energy by (3.0±0.6)%, (4.3±1.4)% and (10.2±3.8)%, respectively, in the centrality range 0–50%.

The transport properties of the created matter are investigated by comparing the experimental results with hydrodynamic model calculations, where the shear-viscosity to entropy-density ratio, η/s, is the dominant parameter. Previous studies demonstrated that anisotropic flow measurements are best described by calculations using a value of η/s close to 1/4π, the conjectured lower bound for a quantum fluid. It is observed in figure 1 that the magnitude and the increase of the anisotropic flow measured at the higher energy remain compatible with hydrodynamic predictions, favouring a constant value of η/s from √s_NN = 2.76 to 5.02 TeV Pb–Pb collisions.

It is also observed that the results of the pT-differential flow are comparable at the two energies. This indicates that the increase measured in the integrated flow (figure 1) reflects an increase of the mean transverse momentum. Further comparisons of differential-flow measurements with theoretical calculations will provide a unique opportunity to test the validity of the hydrodynamic picture, and to further discriminate between the various possibilities for the temperature dependence of the shear-viscosity to entropy-density ratio of the matter produced in heavy-ion collisions at the highest energies.

CMS hunts for supersymmetry in uncharted territory

The CMS collaboration is continuing its hunt for signs of supersymmetry (SUSY), a popular extension to the Standard Model that could provide a weakly interacting massive-particle candidate for dark matter, if the lightest supersymmetric particle (LSP) is stable.

With the increase in the LHC centre-of-mass energy from 8 to 13 TeV, the production cross-section for hypothetical SUSY partners rises; the first searches to benefit are those looking for the strongly coupled SUSY partners of the gluon (gluino) and quarks (squarks) that had the most stringent mass limits from Run 1 of the LHC. By decaying to a stable LSP, which does not interact in the detector and instead escapes, SUSY particles can leave a characteristic experimental signature of a large imbalance in transverse momentum.

Searches for new physics based on final states with jets (collimated bundles of particles) and large transverse-momentum imbalance are sensitive to broad classes of new-physics models, including supersymmetry. CMS has searched for SUSY in this final state using a variable called the "stransverse mass", MT2, to measure the transverse-momentum imbalance, which strongly suppresses fake contributions due to potential hadronic-jet mismeasurement. This allows us to control the background from copiously produced QCD multi-jet events. The remaining background comes from Standard Model processes such as W, Z and top-quark pair production with decays to neutrinos, which also produce a transverse-momentum imbalance. We estimate our backgrounds from orthogonal control samples in data targeted at each. To cover a wide variety of signatures, we categorise our signal events according to the number of jets, the number of jets arising from bottom quarks, the sum of the transverse momenta of hadronic jets (HT), and MT2. Some SUSY scenarios predict spectacular signatures, such as four top quarks and two LSPs, which would give large values for all of these quantities, while others with small mass splittings produce much softer signatures.
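
For illustration, MT2 can be written as a minimisation over all ways of splitting the missing transverse momentum between the two escaping invisible particles. Below is a brute-force toy sketch in the massless approximation; the grid bounds and step count are arbitrary choices, and this is not the CMS implementation:

```python
import math

def mt(px_v, py_v, px_i, py_i):
    # Transverse mass of a massless visible + invisible pair.
    et_v = math.hypot(px_v, py_v)
    et_i = math.hypot(px_i, py_i)
    return math.sqrt(max(0.0, 2.0 * (et_v * et_i - px_v * px_i - py_v * py_i)))

def mt2(v1, v2, met, steps=200, qmax=200.0):
    # Scan splittings met = q1 + q2 of the missing momentum (toy GeV grid),
    # keeping the smallest value of max(MT1, MT2).
    best = float("inf")
    for ix in range(steps + 1):
        for iy in range(steps + 1):
            q1x = -qmax + 2.0 * qmax * ix / steps
            q1y = -qmax + 2.0 * qmax * iy / steps
            m = max(mt(*v1, q1x, q1y),
                    mt(*v2, met[0] - q1x, met[1] - q1y))
            best = min(best, m)
    return best

# Toy event: two W -> l nu decays, each W roughly at rest (m_W ~ 80 GeV).
# The event-by-event MT2 is bounded above by the parent mass.
val = mt2((40.0, 0.0), (0.0, 40.0), (-40.0, -40.0))
```

Over many events the endpoint of the MT2 distribution sits at the parent mass, which is what makes it a powerful variable both for mass measurement and for suppressing mismeasured QCD events.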

Unfortunately, we did not observe any evidence for SUSY in the 2015 data set. Instead, we are able to significantly extend the constraints on the masses of SUSY partners beyond those from the LHC Run 1. The gluino has the largest production cross-section and many potential decay modes. If the gluino decays to the LSP and a pair of quarks, we exclude gluino masses up to 1550–1750 GeV, depending on the quark flavour, extending our Run 1 limits by more than 300 GeV. We are also sensitive to squarks, with our constraints summarised in figure 1. We set limits on bottom-squark masses up to 880 GeV, top squarks up to 800 GeV, and light-flavour squarks up to 600–1260 GeV, depending on how many states are degenerate in mass.

Even though SUSY was not waiting for us around the corner at 13 TeV, we look forward to the 2016 run, where a large increase in luminosity gives us another chance at discovery.

LHCb awards physics prizes for its Kaggle competition

Machine learning, also known in physics circles as multivariate analysis, is used more and more in high-energy physics, most visibly in data analysis but also in other applications such as triggering and reconstruction. The community of machine-learning data scientists organises competitions on the "Kaggle" platform to solve difficult and interesting challenges in different fields.

With the aim of developing interactions with the machine-learning community, LHCb organised such a competition, featuring the search for the lepton-flavour-violating decay τ → μμμ. This decay is (almost) forbidden in the Standard Model, and its observation would therefore indicate a discovery of "new physics", now a key goal of the LHC. This Kaggle challenge (https://www.kaggle.com/c/flavours-of-physics) was conceived by a group of scientists from CERN, the University of Warwick, the University of Zürich and the Yandex School of Data Analysis. It was financially supported by the Yandex Data Factory, Intel and the University of Zürich. The competition took place over three months, between July and October 2015. More than 700 people competed to achieve the best signal-versus-background discrimination and to win the prizes awarded to the three top-ranked solutions, totalling $15,000.

This particular challenge, using both "real" and simulated LHCb data, has been recognised by the community as more complicated than the usual challenges, and therefore a refreshing problem to try to solve. The winners of the competition were awarded their prizes in December at one of the main conferences of the machine-learning community – the Twenty-ninth Annual Conference on Neural Information Processing Systems (NIPS).

In addition to the prizes for the best-ranked solutions, another prize was foreseen for the solution judged most interesting from a physics point of view. In the event, LHCb decided to award two such physics prizes of $2000 each, to Vicens Gaitan (a former member of the ALEPH collaboration at CERN's Large Electron–Positron collider) and to Alexander Rakhlin. Their solutions are innovative and particularly suitable for cases where the size of the samples used to train the multivariate classifier is limited, and where the training samples do not perfectly match the real data.

The two awardees collected their prizes at a three-day workshop organised at the University of Zürich on 18–21 February as a follow-up to the Kaggle challenge. The workshop brought together 55 people from the LHC and machine-learning communities, and interesting ideas were exchanged. The general conclusion from discussions at the event was that the exercise had been a very positive one, both for LHCb and for those who entered the competition.

Another important step for the AWAKE experiment

By harnessing the power of wakefields generated by a proton beam in a plasma cell, the AWAKE experiment at CERN (CERN Courier November 2013 p17) aims to produce accelerating gradients hundreds of times higher than those achieved in current machines.

The experiment is being installed in the tunnel previously used by the CERN Neutrinos to Gran Sasso facility. In AWAKE, a beam of 400 GeV protons from the CERN Super Proton Synchrotron will travel through a plasma cell and generate a wakefield that, in turn, will accelerate an externally injected electron beam. A laser pulse will ionise the gas in the cell, turning it into a plasma, and will seed the self-modulation instability that triggers the wakefield. The project aims to prove that a plasma wakefield can be driven with protons and that the resulting acceleration will be extremely powerful – hundreds of times more powerful than that achieved today – and eventually to provide a design for a plasma-based linear collider.
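
The scale of those gradients can be estimated from the cold-plasma wave-breaking field, E0 = me·c·ωp/e. The plasma density below is of the order foreseen for AWAKE, but is an assumed value for this sketch:

```python
import math

# Cold-plasma wave-breaking field: a common scale for the accelerating
# gradient achievable in plasma-wakefield schemes.
eps0 = 8.854e-12   # F/m, vacuum permittivity
m_e = 9.109e-31    # kg, electron mass
e = 1.602e-19      # C, elementary charge
c = 2.998e8        # m/s, speed of light

n = 7e14 * 1e6     # m^-3 (plasma density of order that foreseen for AWAKE; assumed)
omega_p = math.sqrt(n * e**2 / (eps0 * m_e))   # plasma frequency, rad/s
E0 = m_e * c * omega_p / e                     # V/m; a few GV/m at this density
```

The result is of order GV/m, compared with the tens of MV/m typical of conventional RF cavities, consistent with the "hundreds of times" quoted above.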

The AWAKE tunnel is progressively being filled with its vital components. In its final configuration, the facility will feature a clean room for the laser, a dedicated area for the electron source and two new tunnels for two new beamlines: one small tunnel to hold the laser beam, which ionises the plasma and seeds the wakefields, and a second, larger tunnel that will be home to the electron beamline – the “witness beam” accelerated by the plasma. At the beginning of February, the plasma cell was lowered into the tunnel and moved to its position at the end of the proton line. The cell is a 10 m-long component developed by the Max Planck Institute for Physics in Munich (Germany). A first prototype successfully completed commissioning tests in CERN’s North Area in the autumn of 2015. The prototype allowed the AWAKE collaboration to validate the uniformity of the plasma temperature in the cell.

AWAKE is a collaborative endeavour with institutes and organisations participating around the world. The synchronised proton, electron and laser beams provided by CERN are an integral part of the experiment. After installation of the plasma cell, the next step will be installation of the laser, the vacuum equipment and the diagnostic system for both laser and proton beams.

Beam commissioning for the proton beamline is scheduled to start this summer. The programme will continue with installation of the electron line, with the aim of starting acceleration tests at the end of 2017.
