Bridging the gap

If you live in a low- or middle-income country (LMIC), your chances of surviving cancer are significantly lower than if you live in a wealthier economy. That’s largely due to the limited availability of radiation therapy (see “The changing landscape of cancer therapy”). Between 2015 and 2035, the number of cancer diagnoses worldwide is expected to increase by 10 million, with around 65% of those cases in poorer economies. Approximately 12,600 new radiotherapy treatment machines and up to 130,000 trained oncologists, medical physicists and technicians will be needed to treat those patients.

Experts in accelerator design, medical physics and oncology met at CERN on 26–27 October 2017 to address the technical challenge of designing a robust linear accelerator (linac) for use in harsher environments. Jointly organised by CERN, the International Cancer Expert Corps (ICEC) and the UK Science and Technology Facilities Council (STFC), the workshop was funded through the UK Global Challenges Research Fund, enabling participants from Botswana, Ghana, Jordan, Nigeria and Tanzania to share their local knowledge and perspectives. The event followed a successful inaugural workshop in November 2016, also held at CERN (CERN Courier March 2017 p31).

The goal is to develop a medical linear accelerator that provides state-of-the-art radiation therapy in situations where the power supply is unreliable, the climate harsh and/or communications poor. The immediate objective is to develop work plans involving Official Development Assistance (ODA) countries that link to the following technical areas (which correspond to technical sessions in the October workshop): RF power systems; durable and sustainable power supplies; beam production and control; safety and operability; and computing.

Participants agreed that improving the operation and reliability of selected components of medical linear accelerators is essential to deliver better linear accelerators and associated instrumentation in the next three to seven years. A frequent impediment to reliable delivery of radiotherapy in LMICs, and other underserved regions of the world, is the environment within which the sophisticated linear accelerator must function. Excessive ambient temperatures, inadequate cooling of machines and buildings, extensive dust in the dry season and high humidity in some ODA countries are only a few of the environmental factors that can challenge both the robustness of treatment machines and the general infrastructure.

Simplicity of operation is another significant factor in using linear accelerators in clinics. The factors limiting the development of radiotherapy in lower-resourced nations include not only the cost of equipment and infrastructure, but also a shortage of trained personnel to properly calibrate and maintain the equipment and to deliver high-quality treatment. On one hand, the radiation technologist should be able to set up treatments under the direction of the radiation oncologist and in accordance with the treatment plan. On the other hand, maintenance of the linear accelerators should be as easy as possible, from remote upgrades to monitoring that anticipates component failures. These centres, and their machines, should be able to provide treatment on a 24/7 basis if needed and, at the same time, deliver first-class treatment consistent with that offered in richer countries. STFC will help to transform ideas and projects presented at the next workshop, scheduled for March 2018, into a comprehensive technology proposal for a novel linear accelerator. This will then be submitted to the Global Challenges Research Fund Foundation Awards 2018 call for further funding. This ambitious project aims to have facilities and staff available to treat patients in low- and middle-income countries within 10 years.

Networking against cancer

The inaugural meeting of the European Network for Light Ion Hadron Therapy (ENLIGHT) took place at CERN in February 2002, with the aim of co-ordinating European efforts in innovative cancer treatment strategies using radiation. Specialists from different disciplines, including radiation biology, oncology, physics and engineering, with experience and interest in particle therapy have nurtured the network ever since.

Today, ENLIGHT can count on the contribution of more than 700 members from all continents. Together, they identify and tackle the technical challenges related to the use of highly sophisticated machines, train young and specialist researchers, and seek funding to ensure the sustainability and effectiveness of the organisation.

Started with the support of the European Commission (EC), ENLIGHT has co-ordinated four further EC-funded projects in particle therapy: ULICE, PARTNER, ENVISION and ENTERVISION. In the past 15 years, the network has evolved into an open, collaborative and multidisciplinary platform to establish priorities and assess the effectiveness of various treatment modalities. Initially based on the three technology and innovation pillars of high-energy physics – accelerators, detectors and computing – the ENLIGHT initiative has grown into a global effort.

Training essential

ENLIGHT has witnessed a large increase in dedicated particle therapy centres, and innovative medical imaging techniques are starting to make their way into hospitals. Skilled experts for high-tech cancer treatment are, therefore, in high demand. Thanks to the large number of scientists involved and its wide reach, ENLIGHT has enormous potential to offer education and training and, since 2015, has included training sessions in its annual meetings.

Education and training, in addition to pitching for research funding, are the main thrusts of ENLIGHT’s activities today. A project within the CERN & Society Foundation has just been approved, opening a new chapter for ENLIGHT and its community. The benefits lie not only in reinforcing the hadron therapy field with qualified multidisciplinary groups of experts, but especially in helping young scientists flourish in the future.

www.cern.ch/ENLIGHT.

Strategic step for medical impact

Innovative ideas and technologies from physics have contributed to great advances in medicine, in particular radiation-based medical diagnosis and treatment. Today, state-of-the-art techniques derived from particle physics research are routinely used in clinical practice and medical research centres: from technology for PET scanners and dedicated accelerators for cancer therapy (see “The changing landscape of cancer therapy”), to simulation and data analysis tools.

Transferring CERN’s know-how to other fields is an integral part of its mission. Over the past 60 years, CERN has developed widely recognised expertise and unique competencies in particle accelerators, detectors and computing. While CERN’s core mission is basic research in particle physics, these “tools of the trade” have found applications in a variety of fields and can have an impact far beyond their initial expectations. An excellent recent example is the completion of CERN MEDICIS, which uses a proton beam to produce radioisotopes for medical research (see “Isotopes for precision medicine”).

Knowledge transfer (KT) for the benefit of medical applications has become an established part of CERN’s programme, formalised within the KT group. CERN has further initiated numerous international and multidisciplinary collaborations, partially or entirely devoted to technologies with applications in the medical field, some of which have been funded by the European Commission (EC). Until recently, the transfer of knowledge and technology from physics to medicine at CERN has essentially been driven by enthusiastic individuals on an ad hoc basis. In light of significant growth in medical applications-related activities, in 2017 CERN published a formal medical applications strategy (approved by the Council in June).

Its aims are to ensure that medical applications-related knowledge transfer activities are carried out without affecting CERN’s core mission of fundamental research, are relevant to the medical community, and are delivered within a sustainable funding model.

The focus is on R&D projects using technologies and infrastructures that are uniquely available at CERN, seeking to minimise any duplication of efforts taking place in Member States and associate Member States. The most promising CERN technologies and infrastructure that are relevant to the medical domain shall be identified – and the results matched with the requirements of the medical research communities, in particular in CERN’s Member States and associate Member States. Projects shall then be identified, taking into account such things as: maximising the impact of CERN’s engagement; complementarities with work at other laboratories; and the existence of sufficient external funding and resources.

CERN’s medical applications-related activities are co-ordinated by the CERN KT medical applications section, which also negotiates the necessary agreements with project partners. A new KT thematic forum, meanwhile, brings together CERN and Member State representatives to exchange information and ideas about medical applications (see “Faces and Places”). The CERN Medical Applications Steering Committee (CMASC) selects, prioritises, approves and coordinates all proposed medical applications-related projects. The committee receives input from the Medical Applications Project Forum (MAPF), the CERN Medical Applications Advisory Committee (CMAAC) and various KT bodies.

Although CERN can provide a limited amount of seed funding for medical applications projects, external stakeholders must provide the funding needed to deliver their projects. Additional funding may be obtained through the EC Framework Programmes, and the CERN & Society Foundation is another potential source.

The transfer of know-how and technologies from CERN to the medical community represents one of the natural vehicles for CERN to disseminate the results of its work to society as widely as possible. The publication of a formal strategy document marks an important evolution of CERN’s programme and highlights its commitment to maximise the societal impact of its research and to transfer CERN’s know-how and technology to its Member States and associate Member States.

The Physical World: An Inspirational Tour of Fundamental Physics

By Nicholas Manton and Nicholas Mee
Oxford University Press

Ranging from classical to quantum mechanics, from nuclear to particle physics and cosmology, this book aims to provide an overview of various branches of physics in both a comprehensive and concise fashion. As the authors state, their objective is to offer an inspirational tour of fundamental physics that is accessible to readers with a high-school background in physics and mathematics, and to motivate them to delve deeper into the topics covered.

Key equations are presented and their solutions derived, ensuring that each step is clear. Emphasis is also placed on the use of variational principles in physics.

After introducing some basic ideas and tools in the first chapter, the book presents Newtonian dynamics and the application of Newton’s law of gravitation to the motion of bodies in the solar system. Chapter 3 deals with the electromagnetic field and Maxwell’s equations. From classical physics, the authors jump to Einstein’s revolutionary theory of special relativity and the concept of space–time. Chapters 5 and 6 are devoted to curved space, general relativity and its consequences, including the existence of black holes. The other revolutionary idea of the 20th century, quantum mechanics, is discussed in chapters 7 and 8, while chapter 9 applies this theory to the structure and properties of materials, and explains the fundamental principles of chemistry and solid-state physics. Chapter 10 covers thermodynamics, built on the concepts of temperature and entropy, and gives special attention to the analysis of black-body radiation. After an overview of nuclear physics (chapter 11), chapter 12 presents particle physics, including a short description of quantum field theory, the Standard Model with the Higgs mechanism and the recent discovery of its related boson. Chapters 13 and 14 are about astrophysics and cosmology, while the final chapter discusses some of the fundamental problems that remain open.

The Cosmic Cocktail: Three Parts Dark Matter

By Katherine Freese
Princeton University Press

Also available at the CERN bookshop

This book by Katherine Freese, now out in paperback, is aimed at non-professionals interested in dark matter. The hypothesis that the matter in galaxy clusters is dominated by a non-luminous component, and hence is dark, goes back to a paper published in 1933 by the Swiss astronomer Fritz Zwicky, who also coined the term “dark matter”. But it has only been during the last 20 years or so that we have realised that the matter in the universe is dominated by dark matter and that most of it is non-baryonic, i.e. not made of the stuff that makes up all the other matter we know.

The author explains the observational evidence for dark matter and its relevance for cosmology and particle physics, both in a formal scientific context and also based on her personal adventures as a researcher in this field. I especially enjoyed her detailed, well-informed discussion and evaluation of present dark-matter searches.

The book is structured in nine chapters. The first is a personal introduction, followed by a historical account of the growing evidence for dark matter. Chapter 3 discusses our present understanding of the expanding universe, explaining how much of what we know is due to the very accurate observations of the cosmic microwave background. This is followed by a chapter on Big Bang nucleosynthesis, describing how the first elements beyond hydrogen (deuterium, helium-3, lithium and especially helium-4) were formed in the early universe. In the fifth chapter, the plethora of dark-matter candidates – ranging from axions to WIMPs and primordial black holes – is presented. Chapter 6 is devoted to the LHC at CERN: its four experiments are briefly described and the discovery of the Higgs boson is recounted. Chapters 6 and 7 are at the heart of the author’s own research (the author is a dark-matter theorist and not heavily involved in any particular dark-matter experiments). They discuss the experiments that can be undertaken to detect dark matter, either directly, indirectly or via accelerator experiments. An insightful and impartial discussion of present experiments with tentative positive detections is presented in chapter 8. The final chapter is devoted to dark energy, responsible for the accelerated expansion of the universe. Is it a cosmological constant or vacuum energy with a value that is many orders of magnitude smaller than what we would expect from quantum field theory? Is it a dynamical field or does the beautiful theory of general relativity break down at very large distances?

Even though in some places inaccuracies have slipped in, most explanations are rigorous yet non-technical. In addition to the fascinating subject, the book contains a lot of interesting personal and historical remarks (many of them from the first- or second-hand experience of the author), which are presented in an enthusiastic and funny style. They are one of the characteristics that make this book not only an interesting source of information but also a very enjoyable read.

As a female scientist myself, I appreciated the way the author acknowledges the work of women in science. She presents a picture of a field of research that has been shaped by many brilliant female scientists, starting from Vera Rubin’s investigations of galaxy rotation curves and ending with Elena Aprile’s and Laura Baudis’ lead in the most advanced direct dark-matter searches. It seems it takes a woman to do justice to our outstanding female colleagues.

The fact that less than three years after the first publication of the book some cosmological parameters have shifted and some information about recent experiments is already outdated only tells us that dark matter is a hot topic of very active research. I sincerely hope that the author’s gut feeling is correct and the discovery of dark matter is just around the corner.

The Photomultiplier Handbook

By A G Wright
Oxford University Press

This volume is a comprehensive handbook aimed primarily at those who use, design or build vacuum photomultipliers. Drawing on his 40 years of experience as a user and manufacturer, the author wrote it to fill perceived gaps in the existing literature.

Photomultiplier tubes (PMTs) are extremely sensitive light detectors, which multiply the current produced by incident photons by up to 100 million times. Since their invention in the 1930s they have seen huge developments that have increased their performance significantly. PMTs have been and still are extensively applied in physics experiments and their evolution has been shaped by the requirements of the scientific community.
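
To put that multiplication factor in context, the overall gain of a PMT is simply the product of the secondary-emission gains of its dynodes. As a rough illustration (the values of δ and N below are typical assumptions, not figures taken from the book):

\[ G = \delta^{N}, \qquad \delta \approx 4,\; N = 12\text{--}14 \;\Rightarrow\; G \approx 10^{7}\text{--}10^{8}, \]

consistent with the factor of up to 100 million quoted above.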

The first group of chapters sets the scene, introducing light-detection techniques and discussing in detail photocathodes – important components of PMTs – and optical interfaces. Since light generation and detection are statistical processes, detectors based on electron multiplication are also statistical in their operation. As a consequence, a chapter is dedicated to the theory of statistical processes relevant to choosing, using or designing PMTs. The second part of the book deals with all of the important parameters that determine the performance of a PMT, each analysed thoroughly: gain, noise, background, collection and counting efficiency, dynamic range and timing. The effects of environmental conditions on performance are also discussed. The last part is devoted to instrumentation, in particular voltage dividers and electronics for PMTs.

Each chapter concludes with a summary and a comprehensive set of references. Three appendices provide additional useful information.

The book could become a valuable reference for researchers and engineers, and for students working with light sensors and, in particular, photomultipliers.

The Lazy Universe: An Introduction to the Principle of Least Action

By Jennifer Coopersmith
Oxford University Press

With contagious enthusiasm and a sense of humour unusual in this kind of literature, this book by Jennifer Coopersmith deals with the principle of least action or, to be more rigorous, of stationary action. As the author states, this principle defines the tendency of any physical system to seek out the “flattest” region of “space” – with appropriate definitions of the concepts of flatness and space. This is certainly not among the best-known laws of nature, despite its ubiquity in physics and having survived the advent of several scientific revolutions, including special and general relativity and quantum mechanics. The author makes a convincing case for D’Alembert’s principle (as it is often called) as a more insightful and conceptually fertile basis to understand classical mechanics than Newton’s laws. As she points out, Newton and D’Alembert asked very different questions, and in many cases variational mechanics, inspired by the latter, is more natural and insightful than working in Newton’s absolute space, but it can also feel like using a sledgehammer to crack a peanut.

The book starts with a general and very accessible introduction to the principle of least action. Then follows a long and interesting description of the developments that led to the principle as we know it today. The second half of the book delves into Lagrangian and Hamiltonian mechanics, while the final chapter illustrates the relevance of the principle for modern (non-classical) physics, although this theme is also touched upon several times in the preceding chapters.

An important caveat is that this is not a textbook: it should be seen as complementary to, rather than a replacement for, a standard introduction to the topic. For example, the Euler–Lagrange equation is presented but not derived and, in general, mathematical formulae are kept to a bare minimum in the main text. Coopersmith compensates for this with several thorough appendices, which range from classical textbook-like examples to original derivations. She makes a convincing critique of a famous argument by Landau and Lifshitz to demonstrate the dependence of kinetic energy on the square of the speed, and in one of the appendices she develops an interesting alternative explanation.
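
For orientation, the equation in question is standard textbook material (quoted here as a reminder, not as a substitute for the book’s treatment): requiring the action to be stationary under variations of the path leads to the Euler–Lagrange equation,

\[ S[q] = \int_{t_1}^{t_2} L(q, \dot q, t)\,\mathrm{d}t, \qquad \delta S = 0 \;\Longrightarrow\; \frac{\mathrm{d}}{\mathrm{d}t}\!\left(\frac{\partial L}{\partial \dot q}\right) - \frac{\partial L}{\partial q} = 0. \]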

Although the author gives a lot of credit to The Variational Principles of Mechanics by Cornelius Lanczos (written in 1949 and re-edited in 1970), hers is a very different kind of book aimed at a different public. Moreover, the author has developed several original and insightful analogies. For example, she remarks upon how smartphones know their orientation: instead of measuring positions and angles with respect to external (absolute) space, three accelerometers in the phone measure tiny motions in three directions of the local gravity field. This is reminiscent of the methods of variational mechanics.

Notation is consistent throughout the book and clearly explained, and footnotes are used wisely. With an unusual convention that is never made explicit, the author graphically warns the reader when a footnote is witty or humorous, or potentially perceived as far-fetched, by putting the text in parentheses.

My main criticism concerns the frequent references to distant chapters, which entangle the logical flow. This is a book made for re-reading and, as a result, it might be difficult to follow for readers with little previous knowledge of the topic. Moreover, I was rather baffled by the author’s confession (repeated twice) that she was unable to find a quote by Feynman that she is sure to have read in his Lectures. Nevertheless, these minor flaws do not diminish my general appreciation for Coopersmith’s very useful and well-written book.

The first part is excellent reading for anybody with an interest in the history and philosophy of science. I also recommend the book to students in physics and mathematics who are willing to dig deeper into this subject after taking classes in analytical mechanics, and I believe that it is accessible to any student in STEM disciplines. Practitioners in physics from any sub-discipline will enjoy a refresh and a different point of view that puts their tools of the trade in a broader context.

Exploring the physics case for a very-high-energy electron–proton collider

Rapid progress is being made in novel acceleration techniques. An example is the AWAKE experiment at CERN (CERN Courier January/February 2017 p8), which is currently in the middle of its first run demonstrating proton-driven plasma wakefield acceleration. This has inspired researchers to propose further applications of this novel acceleration scheme, among them a very-high-energy electron−proton (VHEeP) collider.

Simulations show that electrons can be accelerated up to energies in the TeV region over a length of only a kilometre using the AWAKE scheme. The VHEeP collider would use one of the LHC proton beams to drive a wakefield and accelerate electrons to an energy of 3 TeV over a distance less than 4 km, then collide the electron beam with the LHC’s other proton beam to yield electron−proton collisions at a centre-of-mass energy of 9 TeV – 30 times higher than the only other electron−proton collider, HERA at DESY. Other applications of the AWAKE scheme with electron beams up to 100 GeV are being considered as part of the Physics Beyond Colliders study at CERN (CERN Courier November 2016 p28).
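
The quoted figure follows from the usual collider kinematics; assuming the nominal 6.5 TeV LHC proton energy and neglecting particle masses and crossing angles,

\[ \sqrt{s} \simeq \sqrt{4\,E_e E_p} = \sqrt{4 \times 3\,\mathrm{TeV} \times 6.5\,\mathrm{TeV}} \approx 8.8\,\mathrm{TeV}, \]

roughly 30 times HERA’s centre-of-mass energy of about 0.32 TeV.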

Of course, it’s very early days for AWAKE. Currently the scheme offers instantaneous luminosities for VHEeP of just 10²⁸–10²⁹ cm⁻² s⁻¹, mainly due to the need to refill the proton bunches in the LHC once they have been used as wakefield drivers. Various schemes are being considered to increase the luminosity, but for now the physics case of a VHEeP collider with very high energy but moderate luminosities is being explored. Motivated by these ideas, a workshop called “Prospects for a very high energy ep and eA collider” took place on 1–2 June at the Max Planck Institute for Physics in Munich to discuss the VHEeP physics case.

Electron−proton scattering can be characterised by the variables Q² (the squared four-momentum of the exchanged boson) and x (the fraction of the proton’s momentum carried by the struck parton), and VHEeP would extend the reach in both by a factor of 1000, to higher Q² and lower x. The energy dependence of hadronic cross-sections at high energies, such as the total photon−proton cross-section, could be measured (a topic with synergies with cosmic-ray physics), and QCD and the structure of matter could be better understood in a region where the effects are completely unknown. With values of x down to 10⁻⁸ expected for Q² > 1 GeV², effects of saturation of the proton structure should be observable, and searches at high Q² for physics beyond the Standard Model will be possible, most significantly through increased sensitivity to the production of leptoquarks.
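
These statements can be tied together with the standard deep-inelastic-scattering relations (textbook kinematics, not results from the workshop):

\[ x = \frac{Q^2}{2\,p \cdot q}, \qquad y = \frac{p \cdot q}{p \cdot k}, \qquad Q^2 \simeq s\,x\,y, \]

so for fixed Q² the smallest accessible x scales as 1/s: at √s = 9 TeV and Q² = 1 GeV², x can reach roughly Q²/s ≈ 10⁻⁸, about a factor of 1000 lower than at HERA for the same Q².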

Pioneer of applied superconductivity: Henri Desportes 1933–2017

It is with great sadness that we announce the death of Henri Desportes, at the age of 84, on 24 September in Gif-sur-Yvette, France. He was the head of the CEA Saclay department STCM until his retirement in the mid 1990s. From the 1960s onwards he was a pioneer of applied superconductivity, and he rapidly became an internationally recognised expert in the development of numerous accelerator and detector magnet systems for high-energy physics.

In particular, Desportes contributed to the creation of the first superconducting magnets for many experimental programmes, including: polarised targets (HERA, installed at CERN and then in Protvino); the 15-foot bubble chamber at Argonne National Laboratory in the US; the magnet of the CERN hybrid spectrometer bubble chamber in 1972; the first thin-walled solenoid, CELLO, in 1978 at DESY; and the solenoid for the ALEPH experiment at LEP in 1986. His early participation in the genesis and design of the large magnets for the CMS and ATLAS detectors for the LHC should also not be forgotten.

Desportes supervised a large body of work at Saclay on the development of innovative superconducting magnets with a wide range of scientific, technical and medical applications. He was the main initiator of new techniques involving indirect helium cooling, the stabilisation of superconductors by aluminium co-extrusion, and externally supported coils. Henri worked on all of these subjects with some of the great names in physics. It is partly thanks to him that Saclay has been involved in most of the magnets for large detectors built in Europe since the early 1970s. For this work he received a prestigious IEEE Council on Superconductivity Award in 2002.

We will remember his courtesy, his humour and his unfailing involvement in these flagship projects that have contributed greatly to physics experiments and to several fundamental discoveries.

Baby MIND takes first steps

In mid-October, a neutrino detector that was designed, built and tested at CERN was loaded onto four trucks to begin a month-long journey to Japan. Once safely installed at the J-PARC laboratory in Tokai, the “Baby MIND” detector will record muon neutrinos generated by beams from J-PARC and play an important role in understanding neutrino oscillations at the T2K experiment.

Weighing 75 tonnes, Baby MIND (Magnetised Iron Neutrino Detector) is bigger than its name suggests. It was initiated in 2015 as part of the CERN Neutrino Platform (CERN Courier July/August 2016 p21) and was originally conceived as a prototype for a 100 kt detector for a neutrino factory, specifically for muon-track reconstruction and charge-identification efficiency studies on a beamline at CERN (a task defined within the earlier AIDA project). Early in the design process, however, it was realised that Baby MIND was just the right size to be installed alongside the WAGASCI experiment located next to the near detectors for the T2K experiment, 280 m downstream from the proton target at J-PARC.

T2K studies the oscillation of muon (anti)neutrinos, especially their transformation into electron (anti)neutrinos, on their 295 km-long journey from J-PARC on the east coast of Japan to Kamioka on the other side of the island. The experiment discovered electron-neutrino appearance in a muon-neutrino beam in 2013 and earlier this year reported a two-sigma hint of CP violation by neutrinos, which will be explored further during the next eight years. Another major current target is to remove the ambiguity affecting the measurement of the neutrino mixing angle θ23.
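
For context, in the usual two-flavour approximation (a textbook formula, not a statement of the T2K analysis) the muon-neutrino survival probability over a baseline L at energy E is

\[ P(\nu_\mu \to \nu_\mu) \approx 1 - \sin^2 2\theta_{23}\, \sin^2\!\left( \frac{1.27\,\Delta m^2_{32}\,[\mathrm{eV^2}]\; L\,[\mathrm{km}]}{E\,[\mathrm{GeV}]} \right), \]

which depends on θ23 only through sin²2θ23 and so cannot by itself distinguish θ23 from 90° − θ23; this octant degeneracy is the ambiguity referred to above.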

Baby MIND will help in this regard by precisely tracking and identifying muons produced when muon neutrinos from the T2K beamline interact with the WAGASCI detector. This will allow the ratio of cross-sections in water and plastic scintillator (the active material in WAGASCI) to be determined, helping researchers understand the energy-reconstruction biases that affect target-nucleus-dependent neutrino fluxes and cross-sections. “Besides the water-to-scintillator ratio, the interest of the experiment is to measure a slightly higher-energy beam and compare the energy distribution (simply reconstructed from the muon angle and momentum, which Baby MIND measures) for the various off-axis positions relevant to the T2K and NOvA beams,” says Baby MIND spokesperson Alain Blondel.

Since its approval in December 2015, the Baby MIND collaboration – comprising CERN, the Institute for Nuclear Research of the Russian Academy of Sciences, and the universities of Geneva, Glasgow, Kyoto, Sofia, Tokyo, Uppsala, Valencia and Yokohama – has designed, prototyped, constructed and tested the Baby MIND apparatus, which includes custom designed magnet modules, electronics, scintillator sensors and support mechanics.

Significant departure

The magnet modules were the responsibility of CERN, and mark a significant departure from traditional magnetised-iron neutrino detectors, which have large coils threaded through the entire iron mass. Each of the 33 two-tonne Baby MIND iron plates is magnetised by its own aluminium coil, a feature imposed by access constraints in the shaft at J-PARC and resulting in a highly optimised magnetic field in the tracking volume. Between them, plastic scintillator slabs embedded with wavelength-shifting fibres transmit light produced by the interactions of ionising particles to silicon photomultipliers.

The fully assembled Baby MIND detector was qualified with cosmic rays prior to tests on a beamline at the experimental zone of CERN’s Proton Synchrotron in the East Area during the summer of this year, and analyses showed the detector to be working as expected. First physics data from Baby MIND are expected in 2018. “That new systems for the Baby MIND were designed, assembled and tested on a beamline in a relatively short period of time (around two years) is a great example of people coming together and optimising the detector by using the latest design tools and benefiting from the pool of experience and infrastructures available at CERN,” says Baby MIND technical co-ordinator Etam Noah.
