Tutorials in Radiotherapy Physics: Advanced Topics with Problems and Solutions

By Patrick N McDermott
CRC Press

This book addresses five selected physics topics in modern cancer radiation therapy, examining them in more detail than standard medical-physics textbooks do. The author has also formulated and solved a large number of exercises, which are provided at the end of each chapter together with a detailed bibliography.

Despite its title, the book is not a substitute for comprehensive textbooks in medical-radiation physics; rather, it complements them. It is therefore of interest to experienced medical physicists who would like to better understand the physics of their daily work, as well as to young researchers approaching this discipline for the first time, often after a PhD in particle physics.

The first section deals with the main tool of modern cancer radiation therapy: the electron linear accelerator (linac). Starting from the basics of electrodynamics, travelling- and standing-wave linear accelerators are discussed together with resonating cavities. Particular care is given to mathematical formulations and to the definition of symbols. This chapter could also appeal to accelerator physicists keen to learn more about electron acceleration at energies of a few MeV.

Proton therapy, which is generally considered an advanced topic in medical radiation therapy, is approached in a somewhat easier way. Starting from an historical introduction, emphasis is given to accelerators and to dose-distribution systems, with a glimpse of future developments. It is a pity that carbon-ion therapy is not mentioned and that active dose-distribution systems are not discussed in more detail.

The two topics that follow address the daily work of the medical physicist. Dose-computation algorithms are treated following a careful mathematical formulation complemented by examples and references to practical cases. Deterministic radiation transport is introduced, starting from the basic quantities used in medical radiation physics. The transport and Fermi–Eyges equations are then derived and discussed.

The last theme, tumour control and normal tissue complications, is the most relevant for the patient. Is the therapy effective? What is the quality of life after treatment? The answers to these questions may be sought using the bridge that connects physics to medicine. To accomplish this task, models are necessary. Starting from the concepts of probability and of dose-volume histograms, empirical and mechanistic models are presented together with the serial and parallel architecture of the organs in the human body.

The application of radiation physics to medicine is an expanding multidisciplinary field based on knowledge, tools and techniques derived from nuclear and particle physics. This book will therefore appeal not only to curious medical physicists and scientists active in the field, but also to physicists in general who – as the author comments – “like understanding”.

Particle Physics in the LHC Era

By G Barr, D Devenish, R Walczak and T Weidberg
Oxford University Press

This book’s aim, as stated in the introduction, is to provide a practical introduction to particle physics in the LHC era at the level of an advanced undergraduate or introductory graduate course. Indeed, in its almost 400 pages, it covers a wide range of topics, from instrumentation and detector technologies to some mathematical techniques and the traditional particle-physics topics that are usually included in similar textbooks. It hovers, by design, at the border between the established textbooks aimed at undergraduates and the more advanced graduate texts that often start with quantum field theory.

Following the introduction, the book commences with a three-chapter sequence of somewhat technical content. Chapter 2, dedicated to mathematical methods, covers discrete symmetries, angular momentum and rotations in space, Lorentz invariance and the calculation of phase-space factors, decay widths and cross-sections, and concludes with a brief review of group theory. Chapter 3, on accelerators, includes a concise yet clear description of the basic concepts and terminology often encountered by students starting to work on experiments but not readily available elsewhere. The topics include synchronicity, beam optics, Q values and beam tunes, luminosity, and even some characteristics of past accelerators. Chapter 4 is on particle detectors. Beyond the standard topics expected in such an overview, e.g. the interaction of particles and radiation with matter, the chapter includes topics that are usually neglected, such as short presentations on signal generation, the triggering of experiments and the selection of a magnetic field. As would be expected, calorimetry is well covered, as are tracking detectors, which receive an extensive description including an introduction to solid-state detectors. The topics and detector examples provided are too centred on the LHC and its experiments, though.

Chapter 5, on the static quark model, is the first “particle-physics-proper” section. It’s a clear and self-contained introduction to mesons and baryons, with a modern perspective. The authors have decided to include heavy quarks (with the exception of the top quark) and their mesons and baryons, and the result is a full overview for the reader. Finally, chapter 6 on relativistic quantum mechanics concludes what could be called the first part of the book on “concepts, tools and methods”. There is a modern angle in this chapter: as an example, Weyl spinors are introduced and used, along with the associated Lorentz transforms and spin matrices. This material is better absorbed by graduate students. The rest of the chapter covers the traditional Klein–Gordon and Dirac equations, and introduces the electromagnetic interaction. It concludes with a short introduction to gauge symmetry.

Chapters 7–10 constitute a second part that concentrates on particle physics. Chapter 7, on weak interactions, covers all of the material from the four-point Fermi interaction to the Standard Model (SM), although without symmetry breaking. The descriptions of V–A, parity violation and the weak interactions of quarks, the CKM matrix and hadron decays via the weak interaction are clear, as is the extended introduction of SU(2) × U(1) symmetry as the basis of the SM. Chapter 8, on experimental tests of electroweak theory, is one of the more modern presentations of the topics covered: it starts with neutrino interactions and charged and neutral currents, and moves to Z physics and then WW production at LEP. It includes some experimental aspects such as the use of resonant depolarisation for the precise determination of the LEP beam energy. Moving away from convention, the discovery of the W and Z bosons at the CERN SPS is left for after the LEP presentation. The chapter concludes with a brief presentation of the discovery of the top quark and some later results from the Tevatron.

Chapter 9, on dynamic quarks, breaks the flow slightly. It contains Rutherford scattering, the quark–parton model and neutrino interactions, and concludes its first part with electron–nucleon deep inelastic scattering. This is a departure from standard practice in most textbooks. The second part of the chapter is on the introduction of colour, QCD, parton distribution functions and hadron–hadron collisions, and the Drell–Yan process. The material, which is extensive but presented quite briefly, is more appropriate for undergraduates.

Chapters 10 (oscillations and CP violation in meson systems) and 11 (neutrino oscillations) are excellent introductions to mixing physics in both the quark and lepton sectors. The discussion in chapter 10 is modern, with results from experiments at LEP, the B factories and hadron colliders. Chapter 11 has one of the best summaries of neutrino physics at this level: it starts with the first evidence of mixing in atmospheric neutrinos, proceeds to laboratory experiments, then the MSW effect, solar-neutrino oscillations and three-flavour oscillations, concluding with the measurement of θ₁₃. This chapter is a novel and useful addition to the textbook.

Chapter 12 is on the Higgs boson. It starts with a short introduction to spontaneous symmetry breaking and proceeds to a description of the discovery of the Higgs boson by the ATLAS and CMS experiments. The material, with the exception of a section on the statistical significance, which is too short and ill-placed to be useful, is at the right level for the advanced-undergraduate-to-graduate student audience.

The book concludes with chapter 13 on the LHC and physics beyond the Standard Model (BSM). It has an interesting selection of topics, including expected ones like supersymmetry and some unexpected ones (for a textbook) like the search for new contact interactions and new resonances. The approach is quite experimental in that only the motivation for new phenomena is presented, and the theory is skipped. It is nevertheless a useful introduction to the subject, adequate for motivating students to explore further.

Overall, the book achieves its goal of bridging the gap between undergraduate and graduate textbooks. The descriptions of the various topics are mostly clear, although at times too short. In a formal course, the tutor would probably choose to cover the material in a slightly different order from that presented here, combining material from the first part (chapters 2–6) and the second part (mainly chapters 7–10). In summary, this is a welcome, useful and modern addition to the current list of textbooks in particle physics.

India to become associate Member State

On 21 November, CERN signed an agreement with Sekhar Basu, chairman of the Atomic Energy Commission (AEC) and secretary of the Department of Atomic Energy (DAE) of the government of India, to admit India as an associate Member State.

India has been a partner of CERN for more than 50 years, during which it has made substantial contributions to the construction of the LHC and to the ALICE and CMS experiments, as well as Tier-2 centres for the Worldwide LHC Computing Grid. A co-operation agreement was signed in 1991, but India’s relationship with CERN goes back much further, with Indian institutes having provided components for the LEP collider and one of its four detectors, L3, in addition to the WA93 and WA89 detectors. The success of the DAE–CERN partnership regarding the LHC has also led to co-operation on novel accelerator technologies through DAE’s participation in CERN’s Linac4, SPL and CTF3 projects. India also participates in the COMPASS, ISOLDE and nTOF experiments at CERN.

In recognition of these substantial contributions, India was granted observer status at CERN Council in 2002. When it enters into force, associate membership will allow India to take part in CERN Council meetings and its committees, and will make Indian scientists eligible for staff appointments. “Becoming associate member of CERN will enhance participation of young scientists and engineers in various CERN projects and bring back knowledge for deployment in the domestic programmes,” says Basu. “It will also provide opportunities to Indian industries to participate directly in CERN projects.”

Slovenia to become associate Member State in pre-stage to membership

CERN Council has voted unanimously to admit the Republic of Slovenia to associate membership in the pre-stage to CERN membership. Slovenia’s membership will facilitate, strengthen and broaden the participation and activities of Slovenian scientists, said Slovenian minister Maja Makovec Brenčič, and give Slovenian industry full access to CERN procurement orders. “Slovenia is also aware of the CERN offerings in the areas of education and public outreach, and we are therefore looking forward to become eligible for participation in CERN’s fellows, associate and student programmes.”

Slovenian physicists have participated in the LHC’s ATLAS experiment for the past 20 years, focusing on silicon tracking, protection devices and computing at the Slovenian Tier-2 data centre. However, Slovenian physicists contributed to CERN long before Slovenia became an independent state in 1991, participating in an experiment at LEAR and the DELPHI experiment at LEP. In 1991, CERN and the Executive Council of the Assembly of the Republic of Slovenia signed a co-operation agreement, and in 2009 Slovenia applied to become a Member State.

Following internal approval procedures, Slovenia will join Cyprus and Serbia as an associate Member State in the pre-stage to membership. At the earliest two years thereafter, Council will decide on the admission of Slovenia to full membership. “It is a great pleasure to welcome Slovenia into our ever-growing CERN family as an associate Member State in the pre-stage to membership,” says CERN Director-General Fabiola Gianotti.

Antihydrogen atoms show their colour

Following 20 years of research and development by the CERN antimatter community, the ALPHA collaboration has reported the first ever measurement of the optical spectrum of an antimatter atom. The result, published in Nature in December, involves technological developments that open a completely new era in high-precision antimatter research.

Comprising a single electron orbiting a single proton, hydrogen is the simplest and best-understood atom, and has played a central role in fundamental physics for more than a century. Its spectrum is characterised by well-known spectral lines at certain wavelengths, corresponding to the emission of photons when electrons jump between different orbits. Measurements of the hydrogen spectrum agree with the predictions of quantum electrodynamics at the level of a few parts in 10¹⁵, and CPT invariance requires that antihydrogen has exactly the same spectrum.

The ALPHA team has now succeeded in observing the first spectral line in an atom of antihydrogen, made up of an antiproton and a positron. The measurement concerned the 1S–2S transition, which has a lifetime on the order of a tenth of a second and therefore leads to a narrow spectral line that is particularly suitable for precision measurements. The measurement was found to be in agreement with the hydrogen spectrum, and therefore consistent with CPT invariance, with a relative precision of around 2 × 10⁻¹⁰.
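As a rough scale check, the quoted relative precision translates into an absolute frequency uncertainty of order half a megahertz. The sketch below assumes the hydrogen 1S–2S frequency of about 2.466 × 10¹⁵ Hz, a value not given in the text:

```python
# Scale of the ALPHA result's absolute precision on the 1S-2S line.
# Assumption: hydrogen 1S-2S frequency ~2.466e15 Hz (not quoted in
# the article; the article gives only the relative precision).
f_1s2s_hz = 2.466e15          # assumed 1S-2S transition frequency
relative_precision = 2e-10    # relative precision quoted in the article

absolute_precision_hz = f_1s2s_hz * relative_precision
print(f"absolute precision ~ {absolute_precision_hz / 1e3:.0f} kHz")
```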

Comparing the spectra of hydrogen and antihydrogen was one of the main scientific motivations for CERN’s Antiproton Decelerator (AD), since it offers an extraordinary new tool to test whether matter behaves differently from antimatter and thus test the robustness of the Standard Model. The ALPHA collaboration, which expects to improve the precision of its measurements, generates roughly 25,000 antihydrogen atoms per trial by mixing antiprotons from the AD with positrons. Around 14 antiatoms per trial are trapped and interrogated by a laser at a precisely tuned frequency to measure their internal states.

Low-energy antihydrogen was first synthesised by the ATHENA collaboration in 2002, later repeated by the ATRAP, ALPHA and ASACUSA collaborations, and ALPHA trapped the first antihydrogen atoms in 2010. The new result, along with recent limits on the antiproton–electron mass ratio by the ASACUSA collaboration and antiproton charge-to-mass ratio by the BASE collaboration, demonstrates that tests of fundamental symmetries with antimatter at CERN are maturing rapidly.

AWAKE makes waves

In early December, the AWAKE collaboration made an important step towards a pioneering accelerator technology that would reduce the size and cost of particle accelerators. Having commissioned the facility with first beam in November, the team has now installed a plasma cell and observed a strong modulation of high-energy proton bunches as they pass through it. This signals the generation of very strong electric fields that could be used to accelerate electrons to high energies over short distances.

AWAKE (Advanced Proton Driven Plasma Wakefield Acceleration Experiment) is the first facility to investigate the use of plasma wakefields driven by proton beams. The experiment involves injecting a “drive” bunch of protons from CERN’s Super Proton Synchrotron (SPS) into a 10 m-long tube containing a plasma. The bunch then splits into a series of smaller bunches via a process called self-modulation, generating a strong wakefield as they move through the plasma. “Although plasma-wakefield technology has been explored for many years, AWAKE is the first experiment to use protons as a driver – which, given the high energy of the SPS, can drive wakefields over much longer distances compared with electron- or laser-based schemes,” says AWAKE spokesperson Allen Caldwell of the Max Planck Institute for Physics in Munich.

While it has long been known that plasmas may provide an alternative to traditional accelerating methods based on RF cavities, turning this concept into a practical device is a major challenge. The next step for the AWAKE collaboration is to inject a second beam of electrons, the “witness” beam, which is accelerated by the wakefield just as a surfer accelerates by riding a wave. “To have observed indications for the first time of proton-bunch self-modulation, after just a few days of tests, is an excellent achievement. It’s down to a very motivated and dedicated team,” says Edda Gschwendtner, CERN AWAKE project leader.

Quark–gluon plasma insights

Powerful supercomputer simulations of colliding atomic nuclei have provided new insights about quark–gluon plasma (QGP), a superhot fluid of deconfined partons produced in heavy-ion collisions at the LHC and at RHIC, Brookhaven National Laboratory. Shown in the image are the transverse (arrows) and longitudinal vorticity (contour) distributions of a strongly coupled quark–gluon plasma in the transverse plane at forward spatial rapidity. The coupling between spin and local vorticity shifts the energy level of fermions, leading to different phase-space distributions for fermions with different spin states and therefore spin polarisation along the direction of the local vorticity.

The international team responsible for the work, which involved weeks of processing on a GPU cluster, suggests that longitudinal spin correlations can be used to study the vortex structure of the expanding QGP in high-energy heavy-ion collisions. Unlike the global transverse polarisation, the longitudinal spin correlation does not decrease with beam energy or vanish in event averages. This provides a unique opportunity to study the local fluid vorticity of the QGP at LHC energies, concludes the team. “We can think about this as opening a completely new window of looking at quark–gluon plasmas, and how to study them,” says team member Xin-Nian Wang at the Central China Normal University and Lawrence Berkeley National Laboratory.

Proton–lead run tops record year of LHC operations

On 26 October, the LHC completed its 2016 proton–proton operations at a collision energy of 13 TeV, during which it exceeded the design value of the luminosity and broke many other records (CERN Courier December 2016 p5). As in most years, the machine was then reconfigured for a month-long heavy-ion run, devoted this year to colliding beams of protons (p) and lead nuclei (Pb). Following a feasibility test in 2012 and an initial month-long run in 2013, pPb collisions remain a novel mode of operation at the LHC. Despite this novelty, the LHC team was able to deliver enormous data sets to the experiments for the investigation of extreme nuclear matter during the 2016 run.

Asymmetric proton–nucleus collisions were originally seen as a means to disentangle cold from hot nuclear-matter effects studied in lead–lead collisions. Surprisingly, a more complex picture emerged following the pPb results of 2012 and 2013. For 2016, the LHC experiments requested a variety of apparently incompatible operating conditions, according to their diverse capabilities and physics programmes. Careful analysis of the beam physics and operational requirements led to an ambitious schedule comprising three different beam modes that could potentially fulfil all requests.

Following a technical stop, the first set-up for pPb collisions at a centre-of-mass energy of 5.02 TeV per colliding nucleon pair started on 5 November, and physics data-taking started on 10 November. This run was mainly dedicated to the LHC’s ALICE experiment, to enlarge a sample of minimum-bias events collected earlier. The other experiments also participated, with LHCb studying collisions between protons and a target of helium gas. As foreseen, the beam lifetimes were extremely long, allowing seven days of nearly uninterrupted running at a constant levelled luminosity of 0.8 × 10²⁸ cm⁻² s⁻¹. A total of 660 million minimum-bias events were collected, increasing the 2013 data set by a factor of six. One of the first fills also turned out to be the longest LHC fill ever, lasting almost 38 hours.

Just one day after the 5.02 TeV run ended, the second set-up, involving new high-luminosity beam optics, was complete and the LHC delivered pPb collisions at an energy of 8.16 TeV. This is the highest energy ever produced by a collider for such an asymmetric system; the period included a short run for the LHCf experiment and a third run in which the directions of the Pb and p beams were reversed. Thanks to the superb performance of the injectors and numerous improvements in the LHC, the luminosity soared to 9 × 10²⁹ cm⁻² s⁻¹, which is 7.8 times the design value set some years ago. The luminosity could have been pushed even further had the intense flux of lead beam fragments from the collisions not risked quenching nearby magnets. On 4 December, the LHC was switched back to 5.02 TeV for a final 20 hours of pPb data-taking, delivering a further 120 million minimum-bias events for ALICE.
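Taken together, the two quoted numbers imply a design luminosity of roughly 1.2 × 10²⁹ cm⁻² s⁻¹ for this collision mode. A quick arithmetic sketch (the design value itself is inferred here from the article's figures, not stated independently):

```python
# Back out the implied pPb design luminosity from the quoted numbers.
peak_lumi = 9e29           # cm^-2 s^-1, achieved at 8.16 TeV (quoted)
factor_over_design = 7.8   # quoted ratio to the design value

implied_design_lumi = peak_lumi / factor_over_design
print(f"implied design luminosity ~ {implied_design_lumi:.2e} cm^-2 s^-1")
```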

That such a complex run could be implemented in such a short time was a triumph for the LHC and all those concerned with its design, construction and operation. Every one of the high-priority goals for ATLAS, CMS, ALICE and LHCb, plus some subsidiary ones, was comfortably exceeded. CMS recorded an integrated luminosity of nearly 200 nb⁻¹ at 8.16 TeV, a six-fold increase over the sample collected in the first pPb run in 2013 at 5 TeV, allowing the collaboration to investigate the behaviour of hard probes in high-multiplicity pPb collisions (see article on p11). ATLAS recorded a similar data set, while ALICE and LHCb each received totals well over 30 nb⁻¹ at 8.16 TeV in the two beam directions.

ATLAS makes precision measurement of W mass

A precise measurement of the mass of the W boson, which was discovered at CERN in 1983, is vital because the Standard Model (SM) predicts its value from the masses of the top quark and the Higgs boson. Measuring the W mass therefore tests the self-consistency of the SM, since any deviation from theory would be a sign of new physics. The W mass was measured previously at CERN’s Large Electron–Positron (LEP) collider and Fermilab’s proton–antiproton collider, the Tevatron, yielding a world average of 80.385±0.015 GeV, which is consistent with the SM constraint of 80.358±0.008 GeV.

The ATLAS collaboration has now reported the first measurement of the W mass at the LHC, based on proton–proton collisions at a centre-of-mass energy of 7 TeV (corresponding to an integrated luminosity of 4.6 fb⁻¹). The measured value, 80.370±0.019 GeV, matches the precision of the best single-experiment measurement of the W mass performed by the Tevatron’s CDF experiment, and is consistent with both the SM prediction and combined measurements (see figure).

Measuring the W mass is more challenging at the LHC compared with LEP and the Tevatron because there are a large number of interactions per beam crossing and significant contributions to W production from second-generation quarks (strange and charm). ATLAS measured the W mass by reconstructing the kinematic properties of leptonic decays, in which a W produces an electron or muon and a neutrino in the final state.

The analysis required a highly accurate calibration of the detector response, which was achieved via the large sample of Z-boson events and the precise knowledge of the Z mass. Accurate predictions of the W-boson production and decay properties are also crucial at a proton–proton collider. The enhanced amount of heavy-quark-initiated production and the ratio of valence and sea quarks in the proton affect the W boson’s transverse-momentum distribution and its polarisation, which makes the measurement sensitive to the parton distribution functions of the proton. To address these issues, ATLAS combined the most advanced theoretical predictions with experimental constraints from precise measurements of Z- and W-boson differential cross-sections and of Z-boson transverse momentum and polarisation.

Future analysis of larger data samples at the LHC would allow the reduction of the statistical uncertainty and of several experimental systematic uncertainties. Finally, a better knowledge of the parton distribution functions and improved QCD and electroweak predictions of W- and Z-boson production are crucial to further reduce the theoretical uncertainties.

Run 2 promises a harvest of beauty for LHCb

The first b-physics analysis using data from LHC Run 2, which began in 2015 with proton–proton collisions at an energy of 13 TeV, shows great promise for the physics programme of LHCb. During 2015 and 2016, the experiment collected a data sample corresponding to an integrated luminosity of about 2 fb⁻¹. Although this is smaller than the total integrated luminosity collected in the three years of Run 1 (3 fb⁻¹), the significant increase of the LHC energy in Run 2 has almost doubled the production cross-section of beauty particles. Combined with improvements to the experiment’s trigger system and particle-identification capabilities, this means that LHCb has already more than doubled the statistics of beauty particles on tape with respect to Run 1.
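The scaling behind that statement can be sketched as follows. Recorded yields scale with integrated luminosity times production cross-section times efficiency; the efficiency gain below is an illustrative assumption, since the article does not quantify the trigger and particle-identification improvements:

```python
# Rough Run 2 vs Run 1 beauty-yield scaling at LHCb.
# yield ∝ integrated luminosity × bb-bar cross-section × efficiency
lumi_run1 = 3.0   # fb^-1, Run 1 (quoted)
lumi_run2 = 2.0   # fb^-1, 2015-2016 (quoted)
xsec_gain = 2.0   # cross-section "almost doubled" at 13 TeV (quoted)
eff_gain = 1.6    # trigger/PID improvement -- illustrative assumption

yield_ratio = (lumi_run2 / lumi_run1) * xsec_gain * eff_gain
print(f"Run 2 / Run 1 beauty yield ~ {yield_ratio:.1f}x")
```

With any efficiency gain above 1.5, the product comfortably exceeds two, consistent with the "more than doubled" statement.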

The new analysis is based on 1 fb⁻¹ of available data, aiming to measure the angle γ of the CKM unitarity triangle using B⁻ → D⁰K*⁻ decays. While B⁻ → D⁰K⁻ decays have been extensively studied in the past, this is the first time the B⁻ → D⁰K*⁻ mode has been investigated. The analysis, first presented at CKM2016 (see “Triangulating in Mumbai” in Faces & Places), allows the LHCb collaboration to cross-check expectations for the increase of signal yields in Run 2 using real data. A significant increase, of roughly a factor of three per unit of integrated luminosity, is observed. This demonstrates that the experiment has benefitted from the increase in b-production cross-section, but also that the trigger of the detector performs better than in Run 1. Although the statistical uncertainty on γ from this measurement alone is still large, the sensitivity will be improved by the addition of more data, as well as by the use of other D-meson decay modes. This bodes well for future measurements of γ in this and other decay modes with the full Run 2 data set.

Measurements of the angle γ are of great importance because it is the least well-known angle of the unitarity triangle. The latest combination from direct measurements with charged and neutral B-meson decays and a variety of D-meson final states, all performed with Run 1 data, yielded a central value of 72±7 degrees. LHCb’s ultimate aim, following detector upgrades relevant for LHC Run 3, is to determine γ with a precision below 1°, providing a powerful test of the Standard Model.
