While it is now generally accepted that dark matter makes up the majority of the mass in the universe, little is known about what it is. A favoured hypothesis among particle physicists has long been that dark matter is made of new elementary particles. However, experiments searching for such particles face a serious challenge: neither the particles’ mass nor the strength of their interaction with normal matter is known. So the experiments must cast an ever-widening net in search of these elusive particles.
At the end of February, the Cryogenic Dark Matter Search collaboration announced new results, obtained with the SuperCDMS detector. They expanded their search down to a previously untested dark-matter particle-mass range of 4–6 GeV/c² and a dark-matter–nucleon cross-section range of 1 × 10⁻⁴⁰ cm² down to 1 × 10⁻⁴¹ cm². Their exclusion results contradict recent hints of dark-matter detection by another experiment, CoGeNT, which uses particle detectors made of germanium – the same material used by SuperCDMS.
For their new results, CDMS employed a redesigned cryogenic detector known as iZIP that has ionization and phonon sensors interleaved on both sides of the germanium crystals. This substantially improves rejection of surface events from residual radioactivity, which have limited dark-matter sensitivity in previous searches. The collaboration operated these detectors 0.7 km underground in the Soudan mine in northern Minnesota, to shield them from cosmic-ray backgrounds.
There have been several recent hints of low-mass dark-matter particle detection: from earlier CDMS data taken with silicon rather than germanium detectors, and from three other experiments – DAMA, CoGeNT and CRESST – all finding their data compatible with the existence of dark-matter particles between 5 and 20 GeV/c². But such light dark-matter particles are hard to pin down. The lower the mass of the dark-matter particles, the less energy they leave in detectors, and the more likely it is that background noise will drown out any signals.
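To see why, consider the elastic-scattering kinematics. A dark-matter particle of mass $m_\chi$ moving at velocity $v$ can transfer to a nucleus of mass $m_N$ at most a recoil energy

$$E_R^{\max} = \frac{2\mu^2 v^2}{m_N}, \qquad \mu = \frac{m_\chi m_N}{m_\chi + m_N}.$$

As an illustrative estimate (not a number from the CDMS analysis), a 5 GeV/c² particle striking a germanium nucleus ($m_N \approx 68$ GeV/c²) at a typical galactic velocity of 220 km/s deposits at most a few tenths of a keV – right at the energy threshold of current detectors.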
The new CDMS iZIP detectors, with their improved background rejection, are continuing this search at Soudan, and hopefully soon in the lower-background environment at SNOLAB. Confirming a direct-detection signal of dark matter, and understanding how dark matter interacts with normal matter, is likely to require spotting these particles with different target nuclei in at least two different experiments.
The Standard Model predicts that the photons emitted in b → sγ transitions, which can only occur through loop-level processes, are predominantly left-handed. This means that the asymmetry between the amplitudes with right- and left-handed photons – photon polarization – is close to its minimum value of –1. This quantity has never been observed in a direct measurement and remains largely unexplored. As a consequence, there still exist several extensions of the Standard Model that predict a photon polarization significantly closer to zero but have not been ruled out by other measurements of b → sγ transitions.
The LHCb collaboration has exploited B⁺ → K⁺π⁻π⁺γ decays, which are governed by the b → sγ transition, to probe the photon polarization. The “up–down” asymmetry between the number of photons detected above and below the plane defined by the momenta of the kaon and the two pions in their centre-of-mass frame is proportional to the photon polarization. So, a measurement of a nonzero asymmetry implies observation of photon polarization. The investigation is conceptually similar to the experiment that discovered parity violation in 1957 by measuring a nonzero up–down asymmetry for the electrons emitted in the weak decay of ⁶⁰Co nuclei with respect to their spin direction.
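Schematically, with $\hat\theta$ the angle between the photon direction and the normal to the K⁺π⁻π⁺ plane, the up–down asymmetry can be written as (a sketch of the idea; the published analysis defines the angles and fits in more detail)

$$A_{\mathrm{ud}} \equiv \frac{\int_0^1 \mathrm{d}\cos\hat\theta\,\frac{\mathrm{d}\Gamma}{\mathrm{d}\cos\hat\theta} - \int_{-1}^{0} \mathrm{d}\cos\hat\theta\,\frac{\mathrm{d}\Gamma}{\mathrm{d}\cos\hat\theta}}{\int_{-1}^{1} \mathrm{d}\cos\hat\theta\,\frac{\mathrm{d}\Gamma}{\mathrm{d}\cos\hat\theta}} \;\propto\; \lambda_\gamma,$$

so a nonzero $A_{\mathrm{ud}}$ directly implies a nonzero photon polarization $\lambda_\gamma$.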
Using the full data sample collected with the LHCb detector in 2011 and 2012, the collaboration has reconstructed almost 14,000 B⁺ → K⁺π⁻π⁺γ events. Their angular distribution has been studied in four regions of the K⁺π⁻π⁺ system’s mass, where different kaon resonances and their interferences can result in different sensitivities to the photon polarization. From determination of the up–down asymmetry, A_ud, in each of these mass regions, LHCb finds a combined significance with respect to the null hypothesis of 5.2σ, and therefore observes photon polarization for the first time in such decays (LHCb collaboration 2014). This important result opens the door to the future determination of the value of the photon polarization, which will provide a strong new test of the validity of the Standard Model.
The fusion of two weak bosons is an important process that can be used to probe the electroweak sector of the Standard Model. Measurements of Higgs production via weak-boson fusion are crucial for precise extraction of the Higgs-boson couplings and have the potential to help pin down the charge conjugation and parity of the Higgs boson. A similar process, weak-boson scattering, is sensitive to alternative electroweak symmetry-breaking models and to anomalous weak-boson gauge couplings. These processes are extremely rare and the experimental observation of the production of heavy bosons via weak-boson fusion has become possible only recently with the large centre-of-mass energy and luminosity provided by the LHC. Extracting the signals from the huge backgrounds in the high pile-up conditions at the LHC is a major challenge.
The production of a Z boson via weak-boson fusion (figure 1) is an excellent benchmark for these rare processes. Weak-boson fusion has the characteristic signature of two low-angle jets, one on each side of the detector. These “tagging” jets typically have transverse momentum of the order of the W mass, because they arise from quarks in each proton recoiling against the two W bosons that fuse to produce the Z boson. Another interesting feature is the lack of colour flow between the tagging jets, which means there is little hadronic activity in that region. These features have been exploited by the ATLAS collaboration to extract the purely electroweak contribution to Z-plus-two-jet production, which includes the weak-boson fusion process.
The analysis was carried out using proton–proton collisions at a centre-of-mass energy of 8 TeV recorded by the ATLAS detector in 2012. Events containing a Z boson candidate in association with two high-transverse-momentum jets were selected in the e⁺e⁻ and μ⁺μ⁻ decay channels. The electroweak component was extracted by a fit to the dijet invariant-mass spectrum in an electroweak-enhanced region that was defined, in part, by a veto on additional jet activity in the interval between the tagging jets. The background model was constrained using data in a signal-suppressed control region that was defined by reversing the jet-veto requirement. This data-driven constraint reduced the experimental and theoretical modelling uncertainties on the background model, allowing the electroweak signal to be extracted with a significance above the 5σ level. Figure 2 clearly demonstrates that the background-only model is inconsistent with the data in the electroweak-enhanced region. The cross-section measured for electroweak Z-plus-two-jet production, σ = 54.7 ± 4.6 (stat.) +9.8/−10.4 (syst.) ± 1.5 (lumi.) fb, is in good agreement with the Standard Model prediction of 46.1 ± 1.2 fb.
As the LHC experiments improve the precision of their measurements of Standard Model processes, the extent of possibilities for new physics open to exploration is becoming ever more apparent. Even within a constrained framework for new physics, such as the phenomenological minimal supersymmetric standard model (pMSSM), there is an impressive variety of final-state topologies and unique phenomena. For instance, in regions of the pMSSM where the chargino–neutralino mass difference is small, the chargino can become metastable and exhibit macroscopic lifetimes, potentially travelling anywhere between a few centimetres and many kilometres before it decays. An experiment like CMS can identify these heavy stable charged particles (HSCPs) through specialized techniques, such as patterns of anomalously high ionization in the inner tracker, as well as out-of-time signals in the muon detectors.
The CMS collaboration recently released a reinterpretation of a previously published search for HSCPs that used these techniques to constrain several broad classes of new-physics models (CMS 2013a). There are two purposes for this reinterpretation. The first is to provide a simplified description of the acceptance and efficiency of the analysis as a function of a few key variables. This simplified “map” allows theorists and other interested parties to determine the approximate sensitivity of the CMS experiment to any model that produces HSCPs. This is an essential tool for the broader scientific community, because HSCPs are predicted in a large variety of models and it is important to understand whether gaps in experimental coverage remain.
The second purpose is to provide a concrete example of a reinterpretation in terms of the pMSSM. In this analysis, CMS chose a limited subspace of the full pMSSM, requiring, among other things, that sparticle masses extend only up to about 3 TeV. The figure shows the number of points in this restricted pMSSM subspace that are excluded as a function of the average decay length, cτ, of the chargino. The red points are excluded by the HSCP interpretation described here (CMS 2013b). The blue points are excluded by another CMS search dedicated to “prompt” chargino production (CMS 2012a). The bottom panel shows the fraction of parameter points excluded by each of these two searches. Only a few parameter points, with chargino cτ >1 km, are still not excluded. This is because the theoretical cross-section for these parameter points is small – around 0.1 fb.
This analysis demonstrates the power of the CMS search for HSCPs to cover a broad range of models of new physics. By mapping the sensitivity of the analysis as a function of the HSCP kinematics and the detector geometry, it also makes the results from the search accessible for studies by the broader scientific community.
Although this analysis searches for metastable particles, another open possibility is the production of new, exotic particles that traverse a short distance – around 1 mm to 100 cm – before decaying to visible particles within the detector. CMS has also released results from two searches for such particles. One search looks for decays of these long-lived particles into two jets, and another into two oppositely charged leptons (CMS 2012b and 2012c). The results from these searches exclude production cross-sections for such particles as low as about 0.5 fb, depending on the lifetime and kinematics of the decay.
The field of laser-induced relativistic plasmas and, in particular, laser-driven particle acceleration, has undergone impressive progress in recent years. Despite many advances in understanding fundamental physical phenomena, one unexplored issue is how the particle spins are influenced by the huge magnetic fields inherently present in the plasmas.
Laser-induced generation of polarized-ion beams would undoubtedly be important for research at particle accelerators. In this context, ³He²⁺ ions have been discussed widely. They can serve as a substitute for polarized neutron beams because, in a ³He nucleus, the two protons have opposite spin directions, so the spin of the nucleus is carried by the neutron. However, such beams are currently not available owing to the lack of a corresponding ion source. A promising approach for a laser-based ion source would be to use pre-polarized ³He gas as the target material. Polarization conservation of ³He ions in plasmas is also crucial for the feasibility of proposals aiming to increase the efficiency of fusion reactors by using polarized fuel, because this efficiency depends strongly on the cross-section of the fusion reactions.
A group from Forschungszentrum Jülich (FZJ) and Heinrich-Heine University Düsseldorf has developed a method to measure the degree of polarization of laser-accelerated proton and ion beams. In a first experiment at the Arcturus Laser facility, protons of a few million electron volts – generated most easily by using thin foil targets – were used to measure the differential cross-section d²σ/dϑdφ of the Si(p,p′)Si reaction in a secondary scattering target. The result for the dependence on scattering angle is in excellent agreement with existing data, demonstrating the feasibility of a classical accelerator measurement with a laser-driven particle source.
The azimuthal-angle (φ) dependence of the scattering distributions allowed the degree of polarization of the laser-accelerated protons to be determined for the first time. As expected from computer simulations for the given target configuration, the data are consistent with an unpolarized beam. This “negative” result indicates that the particle spins are not affected by the strong magnetic fields and field gradients in the plasma. This is promising for future measurements using pre-polarized targets, which are underway at Arcturus.
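Schematically, for protons with transverse polarization $p$ scattering off a spin-zero nucleus, the azimuthal distribution behind the secondary target takes the textbook form

$$N(\vartheta,\varphi) \propto N_0(\vartheta)\,\bigl[\,1 + p\,A_y(\vartheta)\cos\varphi\,\bigr],$$

where $A_y(\vartheta)$ is the known analysing power of the reaction. A φ-distribution that is flat within errors, as observed, therefore corresponds to a polarization $p$ consistent with zero.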
The polarization measurements are also an important step towards JuSPARC, the Jülich Short Pulse Particle and Radiation Centre at FZJ. This proposed laser facility will provide not only polarized beams but also intense X-ray and thermal neutron pulses to users from different fields of fundamental and applied research.
On 11 February, the NOvA collaboration announced the detection of the first neutrinos in the long-baseline experiment’s far detector in northern Minnesota. The neutrino beam is generated at Fermilab and sent 800 km through the Earth to the far detector. Once completed, the near and far detectors will weigh 300 and 14,000 tonnes, respectively. Installation of the last module of the far detector is scheduled for early this spring, and outfitting of both detectors with electronics should be completed in summer.
Knowledge of the electron mass has been improved by a factor of 13, thanks to a clever extension of previous Penning-trap experiments. A team from the Max-Planck-Institut für Kernphysik in Heidelberg, GSI and the ExtreMe Matter Institute in Darmstadt, and the Johannes Gutenberg-Universität in Mainz, used a Penning trap to measure the magnetic moment of an electron bound to a carbon nucleus in the hydrogen-like ion ¹²C⁵⁺. The cyclotron frequency of the combined system allowed precise determination of the magnetic field at the position of the electron, while the spin-precession (Larmor) frequency – combined with the theoretically calculated g-factor of the bound electron – allowed the mass of the electron to be extracted.
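Schematically, the electron mass follows from the ratio of the two measured frequencies. For an electron bound in an ion of mass $m_{\mathrm{ion}}$ and charge $q_{\mathrm{ion}}$, the Larmor and cyclotron frequencies in the same field $B$ are $\nu_L = (g/2)\,eB/(2\pi m_e)$ and $\nu_c = q_{\mathrm{ion}}B/(2\pi m_{\mathrm{ion}})$, so $B$ cancels in the ratio:

$$\frac{\nu_L}{\nu_c} = \frac{g}{2}\,\frac{e}{q_{\mathrm{ion}}}\,\frac{m_{\mathrm{ion}}}{m_e} \quad\Rightarrow\quad m_e = \frac{g}{2}\,\frac{e}{q_{\mathrm{ion}}}\,\frac{\nu_c}{\nu_L}\,m_{\mathrm{ion}}.$$

This is a sketch of the principle only; in the actual measurement the g-factor of the bound electron in ¹²C⁵⁺ is taken from a high-precision QED calculation, which is the source of the theoretical uncertainty quoted below.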
The result, in atomic-mass units, is 0.000548579909067(14)(9)(2), where the last error is theoretical. This new value for the electron’s mass will allow comparison of the magnetic moment of the electron with theory – which is good to about 0.08 parts in 10¹² – to better than one part in 10¹².
A new analysis of observations of the galactic centre by the Fermi Gamma-ray Space Telescope strengthens the case for a signal from annihilating dark matter. The authors claim that the excess emission has a spectral shape, spatial extent and normalization in good agreement with predictions of simple models of dark-matter annihilation.
Although still elusive in particle-physics experiments, dark matter is a reality for astronomers. Its presence is implied by the fluctuations in the cosmic microwave background as measured by the Planck mission (CERN Courier May 2013 p12). It is essential for the formation of the first stars and galaxies, and it provides the additional gravitational pull to hold clusters of galaxies together. Nonetheless, dark matter is only detected indirectly, via its effect on ordinary matter, or at most through its gravitational lensing effect on background galaxies observed through a massive galaxy cluster (CERN Courier July/August 2013 p14). Weakly interacting massive particles (WIMPs) are a prime candidate for cold dark matter in the universe. With a mass above about 1 GeV and interacting only through the weak nuclear force and gravity, they can remain invisible because of their lack of electromagnetic interactions. However, the annihilation of WIMPs could potentially produce gamma rays, cosmic rays and neutrinos. Theoretical candidates for WIMPs include the lightest supersymmetric particle – typically a neutralino – and sterile neutrinos.
As with all galaxies, the Milky Way is thought to be surrounded by a spherical halo of dark matter. The density gradient towards the galactic centre makes the latter the best place to search for a gamma-ray signal associated with dark-matter annihilation. The Fermi satellite is well suited for this search at energies between 100 MeV and 300 GeV (CERN Courier November 2008 p13). In the past few years, the analysis of Fermi’s observations of the central region of the Galaxy by different groups has detected a significant excess with a maximum emission at around 1–3 GeV.
A new analysis by a US team led by Tansu Daylan from Harvard University is now attracting attention. The team used a more restrictive selection of gamma rays, including only those with a small positional uncertainty. This allows the researchers to produce gamma-ray maps at higher resolution, enabling an easier separation of the putative spherically symmetric dark-matter signal from the contribution of the Galaxy’s diffuse emission and the “bubbles” found by Fermi (CERN Courier January/February 2011 p11). The team finds an excess emission out to angles of around 10° from the centre of the Galaxy, with no significant deviation from spherical symmetry. The excess has a high significance and a best-fit spatial distribution following a generalized Navarro–Frenk–White halo profile, with an inner slope of γ = 1.26. Such a broad distribution disfavours the proposed alternative that this emission originates from a population of thousands of millisecond pulsars.
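For reference, the generalized Navarro–Frenk–White profile used in such fits takes the form

$$\rho(r) \propto \frac{(r/r_s)^{-\gamma}}{(1 + r/r_s)^{3-\gamma}},$$

where $r_s \approx 20$ kpc is a typical scale radius for the Milky Way (an illustrative value, not one fitted in the analysis). The best-fit inner slope γ = 1.26 thus corresponds to a halo that is somewhat steeper, or “cuspier”, towards the galactic centre than the canonical NFW profile with γ = 1.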
The derived spectrum associated with this excess is well fitted by dark-matter particles with mass 31–40 GeV annihilating to b quarks. The new study disfavours the previously considered 7–10 GeV mass window in which the dark matter annihilates significantly to τ leptons. The annihilation cross-section required to account for the signal is also found to be in good agreement with predictions for dark matter in the form of a simple thermal relic. While this analysis does not provide a discovery of dark-matter annihilation, it is nevertheless a compelling case for this process that will have to be confirmed by corroborating observations in dwarf galaxies around the Milky Way.
More than 350 world experts in accelerators and particle physics, including several laboratory directors, came together at the University of Geneva on 12–15 February to launch the Future Circular Collider (FCC) study, which will examine options for an energy-frontier collider based on a new 80–100-km-circumference tunnel infrastructure. The FCC study, which will be organized as a worldwide international collaboration, comprises a 100 TeV proton (and heavy-ion) collider at the energy frontier, a high-luminosity e⁺e⁻ (H, Z, W and tt̄) factory as a potential intermediate step, and an analysis of options for a hadron–lepton collider. The goal of the study is to deliver a conceptual design report (CDR) together with a cost review by 2018, in time for the next update of the European Strategy for Particle Physics. The CDR will integrate physics, detector, accelerator and infrastructure aspects.
The FCC design study responds to a high-priority request in the 2013 update of the European Strategy for Particle Physics (CERN Courier July/August 2013 p9) stating that “A conceptual design study of options for a future high-energy frontier circular collider at CERN for the post-LHC era shall be carried out”. February’s kick-off meeting was co-sponsored by the Extreme Beams work package 5 of the EuCARD-2 project, within the European Commission’s FP7 Capacities Programme. Participants came from all over the world, with particularly strong representation from China, Japan, Russia and the US, in addition to the many attendees from laboratories and universities across Europe. The goals of the meeting were to introduce the FCC study, to discuss its scope and organization, and to prepare and establish global collaborations.
In his opening address, CERN’s director-general, Rolf Heuer, presented an exciting perspective and explained the main motivations for the FCC, while also cautioning that it was too early to make any cost estimate. Nima Arkani-Hamed of the Institute for Advanced Study in Princeton, and recently appointed as the first director of the Centre for Future High Energy Physics at the Institute of High Energy Physics (IHEP) in Beijing, highlighted the compelling physics case for the 100 TeV hadron collider. Precision physics will be essential at both the lepton and hadron colliders, as Christoph Grojean from the Institut de Física d’Altes Energies in Barcelona underlined.
A similar study for a 50–70 km, double-purpose lepton and hadron collider is being pursued in China, with an attractive site proposal and ambitious schedule. In presenting the project, Yifang Wang, director of IHEP in Beijing, conceded that it would be a difficult project but it would also be very exciting. Even if implemented somewhere other than in China, it would still be beneficial to the field of particle physics in general and to the Chinese high-energy physics and scientific community in particular. To this end, IHEP fully supports a global effort. Fermilab’s associate director for accelerators, Stuart Henderson, also reported a broad acknowledgement in the US that any future collider would need to be a global enterprise, requiring financial and human resources from across the world. He stressed that the US community wishes to play a role in any future collider, while also mentioning several domestic “grass-roots” activities.
Frédérick Bordry, CERN’s director of accelerators and technology, presented the roadmap for CERN. Europe’s top priority for the next two decades is the exploitation of the LHC, with nominal parameters and a total integrated luminosity of about 300 fb⁻¹ by 2023, and with the High-Luminosity LHC upgrade to reach 3000 fb⁻¹ by 2035 (CERN Courier January/February 2014 p12 and p23). In parallel, as one of the next-highest-priority items, the FCC design study will be pursued along with CLIC as a potential post-LHC accelerator project at CERN. Michael Benedikt, the FCC study co-ordinator, reviewed the baseline parameters, design challenges and preparations for global collaboration, stressing that new partner institutes will be welcome throughout the duration of the study. Key technologies are high-field magnets for the hadron collider and an efficient high-power superconducting RF (SRF) system for the lepton collider. Possible R&D goals for the study include the development of short 16-T dipole models in all regions (America, Asia and Europe) by 2018 and, in parallel, demonstration of 20-T magnet technology based on the combination of high- and low-temperature superconductors, as well as SRF developments targeted at overall optimization of system efficiency and cost.
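The 16-T target follows from simple arithmetic. For a circular collider, the beam momentum is fixed by the dipole field $B$ and the magnetic bending radius $\rho$:

$$p\,[\mathrm{GeV}/c] \simeq 0.3\,B\,[\mathrm{T}]\,\rho\,[\mathrm{m}].$$

As a rough sketch (the actual lattice design sets the exact numbers), a 100 km ring with an LHC-like dipole filling factor has $\rho \approx 10.4$ km, so 16-T dipoles give $p \approx 0.3 \times 16 \times 10{,}400 \approx 50$ TeV per beam – that is, 100 TeV proton–proton collisions.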
Philippe Lebrun, former head of CERN’s Accelerator Technology Department, pointed out that, although CERN’s experience in building machines of increasing size and performance can be applied to the study of 80–100 km circular accelerators in the Geneva basin, the step from the 27 km Large Electron–Positron collider and the LHC to the FCC presents major challenges. These will require inventive solutions in accelerator science and technology, as well as in conventional facilities. Felix Amberg from Amberg Engineering – a company involved in the Gotthard Base Tunnel project – reported on and analysed specific aspects of building long tunnels. His presentation suggested that tunnelling costs and risks can be predicted fairly reliably, provided that the project does not extend over too long a time interval and that the legal framework remains stable during the construction period.
After two days of plenary sessions, which surveyed the scope, plan, international situation and design starting points of the FCC, seven parallel sessions gave space for feedback, additional presentations and lively international discussions. Worldwide collaboration in all areas – physics, experiments and accelerators – was found to be essential to reach the level for a CDR by 2018. Key R&D areas for the FCC, such as superconducting high-field magnets and SRF, are of general interest and relevant for many other applications. Significant R&D investments have been made over the past decade(s), for example in the framework of the LHC and High-Luminosity LHC, and continuing this work will ensure efficient use of these investments. At the kick-off meeting a consensus emerged on the approach to form a global collaboration for this study, and many participants expressed a strong interest – both for themselves and their institutes.
Institutes worldwide are now invited to join the global FCC effort, and to submit non-binding written “expressions of interest” regarding specific contributions by the end of May 2014.
ICTR-PHE: uniting physics, medicine and biology
The ICTR-PHE 2014 conference brought together in Geneva some 400 participants from around the world to discuss the latest techniques in the fight against cancer. Researchers and practitioners from many disciplines reviewed the most recent advances in translational research, through which innovations from basic research find applications in healthcare (physics, biology or clinical oncology). For all of the specialists involved, the ICTR-PHE conference is the ideal place to take stock of the work done so far and to define the next steps needed to maintain the momentum.
Physicists, biologists, physicians, chemists, nuclear-medicine experts, radio-oncologists, engineers and software developers – researchers and practitioners from many disciplines came to Geneva on 10–14 February for ICTR-PHE 2014, which brought together for the second time the International Conference on Translational Research in Radio-Oncology and Physics for Health in Europe. The joint conference aims to unite physics, biology and medicine for better healthcare, and the goal of this second meeting was to review the most recent advances in translational research – where developments in basic research are “translated” into means for improving health – in physics, biology and clinical oncology.
The conference featured the many advances that have occurred in the two years since the first joint conference. The resolution and precision of medical imaging are continuing to improve with the use of combined modalities, such as positron-emission tomography (PET) with computed tomography (CT), and PET with magnetic resonance imaging (MRI) – an important technical breakthrough. Biologists and chemists are performing studies to develop new radiation carriers – including antibodies and nanoparticles – to target tumours. The Centro Nazionale di Adroterapia Oncologica (CNAO) in Italy has started hadron therapy with proton beams and carbon-ion beams, obtaining the necessary certification labels for both treatments. Another new centre, MedAustron in Austria, is being built and is reaching the commissioning phase. Moreover, while the use of proton therapy continues to grow around the world, the Japanese centres and the Heidelberg Ion-Beam Therapy Centre in Germany are using carbon-ion therapy on an increasing number of patients. For all of the experts involved in such a variety of different fields, the ICTR-PHE conference was the ideal place to take stock of the work done so far, and to define the next steps that the community should take to keep the momentum high.
Although the first patient was treated with protons 60 years ago in Berkeley, the field has yet to complete all of the phases of clinical trials required by evidence-based medicine and national health systems. In particular, several experts discussed the need to perform randomized trials. This, of course, comes with unavoidable ethical issues and methodological concerns. The community is geographically scattered, and several important factors – such as the integrated dose that should be delivered, the fractionation and the types of tumours to be treated – are still being studied. On one hand, it is a hard task for the various scientists to define common protocols for performing the trials. On the other hand, physicians and patients might be sceptical of new therapies that are not yet felt to have been tested extensively. Despite the fact that every year several thousand patients are diagnosed using radiopharmaceuticals and subsequently treated with hadron therapy, the use of particles is still often regarded with scepticism.
The situation is made even more complex by the fact that the fight against cancer is taking on a more personalized approach. Although highly beneficial to patients, this makes it difficult for doctors to apply the same treatment plan to a large number of people. Cancer is not really a single disease. Its many facets require different therapies for different patients, depending on the specific type of malignant cell, the location of the tumour, its dimensions, etc. Several presentations at the conference focused on the important impact that such personalized treatment has on the disease’s prognosis.
In this respect, the challenge for today’s oncologists starts with high-quality imaging that allows them to define the active tumour volume as well as possible metastases elsewhere in the body. Again, depending on the type of tumour, researchers can now select the best radiopharmaceutical that, once injected into the body and in conjunction with a detection modality such as PET, is able to identify the target cells precisely. Moreover, the same carrier molecules that bring the radiating isotopes to the malignant cells and make them visible to the detecting instruments could be used with more powerful isotopes, to bring a lethal dose into the tumour volume directly. Some of the most recent studies involve the use of specific peptides associated with isotopes obtained at particle accelerators. Others involve innovative nanoparticles as vehicles to bring radiation into the target. Each solution implies the use of specific isotopes. At CERN, the MEDICIS project aims to produce isotopes for medical research. Although the project has only recently entered the construction phase, the collaboration between the MEDICIS team and specialized teams of radiobiologists and chemists has already begun.
Imaging has reached spatial resolutions down to 2 mm. The combination of various imaging techniques, such as PET/CT or PET/MRI, allows oncologists to gather information not only about the geometry of a tumour but also about its functionality. Further improvements could come from both better hardware and more sophisticated software and algorithms for integration of the information. Significant improvement to the hardware could be introduced by the time-of-flight technique – well known to particle physicists for its use in many high-energy experiments.
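The potential gain is easy to estimate. A coincidence timing resolution $\Delta t$ localizes each annihilation along the line of response between the two detected photons to within

$$\Delta x = \frac{c\,\Delta t}{2} \approx \frac{(3\times 10^{10}\ \mathrm{cm/s})\times(300\times 10^{-12}\ \mathrm{s})}{2} \approx 4.5\ \mathrm{cm}$$

for an illustrative $\Delta t = 300$ ps (a representative value for current scintillator technology, not a figure quoted at the conference). This constraint does not by itself replace image reconstruction, but it strongly suppresses noise by restricting where each event can have originated.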
The best treatment
Once the oncologists have acquired the information about the malignant cells and tumour volume, as well as other important data about the patient, they can define the best treatment for a specific case. Computer simulations made with the GEANT4 and FLUKA software suites are used to define the most suitable treatment plan. These codes are in continuous development and are able to deliver increasingly precise information about the dose distribution. In addition to new advances in computer simulations, the ICTR-PHE conference also featured a presentation about the first 3D mapping, over a known distance, of the dose distribution along the whole path of a 62 MeV proton beam. These studies are extremely useful in the determination of collateral damage, including possible secondary tumours caused by particle beams.
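As a rough cross-check on the scale of such simulations, the range of protons in water is well approximated by the empirical Bragg–Kleeman rule (an approximate textbook parametrization, not a result presented at the conference):

$$R \approx \alpha E^{p}, \qquad \alpha \approx 2.2\times 10^{-3}\ \mathrm{cm\,MeV}^{-p}, \quad p \approx 1.77,$$

which for a 62 MeV beam gives $R \approx 3.3$ cm – a path of just a few centimetres, consistent with the shallow treatment depths for which such beams are used.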
Unwanted damage to healthy tissues is a key point when it comes to comparing conventional photon-radiation therapy with hadron therapy. Thanks to intensity-modulation and volumetric-arc techniques, and image-guided treatments, today’s conventional radiation therapy has reached levels of effectiveness that challenge hadron therapy. Nevertheless, because of the specific way they deliver their energy (the well-known Bragg peak), hadrons can target tumours much more precisely. Therefore, hadron beams are potentially much less dangerous to nearby healthy tissues. However, their overall biological impact is still to be evaluated precisely, and the cost of the infrastructure is significantly higher than for widely used conventional radiation therapy. The debate remains open, and a final word will only come once the various teams involved have carried out the necessary clinical trials. The importance of sharing information and data among all active partners was highlighted throughout the conference.
Overall, many of the results presented at the conference were very promising. Not only has the knowledge of cancer increased hugely during recent years, in particular at the molecular level, but also – and even more importantly – a different awareness is gaining momentum within the various communities. As one of the plenary speakers emphasized, the idea that one single oncologist can effectively fight cancer should be abandoned. Instead, collaboration among chemists, biologists, engineers, physicists and physicians should surely improve the prognosis and the end result.
The beneficial impact of such collaboration was particularly evident when the speakers presented results from the combination of various techniques, including surgery and chemotherapy. This is because several factors play a role in the response of malignant cells to radiation: drugs, of course, and also the patient’s immunology, the hypoxia (oxygen deprivation) rate and the inner nature of the tumour cells. Recent studies have shown, for example, that malignant cells infected by HPV have a better response to radiation, which translates into a better prognosis.
The role played by hypoxia and the various ways to overcome it were popular topics. A particularly interesting talk emphasized the need to go a step further and, having already acquired a deep knowledge of hypoxia in the malignant tissues, proceed to treat it with drugs before starting any further therapies. This is not yet the case in the current protocols, despite the many confirmations coming from research studies.
Indeed, the time needed for a new medical advance developed by scientists to reach the patient is a key issue. In this respect, the ICTR-PHE conference has a unique role. Medical doctors can learn about the latest radiopharmaceuticals, the latest imaging instruments and the latest therapies that other scientists have worked on. At the same time, physicists, specialized industry, radiobiologists and others can hear from the medical field where they should concentrate their efforts for future research.
The impression was that the community is very willing to build a new collaboration model and that CERN could play an important role. The newly created CERN Office for Medical Applications is an example of the strength of the laboratory’s wish to contribute to the growth of the field. Medical doctors need cost-effective instruments that are easy to use and reliable over time. This presents a challenge for physicists, who will have to use the most advanced technologies to design new accelerator facilities to produce the hadron beams for patient treatment.
In addition to new accelerators, there is a plethora of opportunities for the physics field. These include the construction of a biomedical facility at CERN to provide particle beams of different types and energies for external users for radiobiology and detector development; the construction and testing of innovative detectors for beam control and medical imaging; the development of state-of-the-art instruments for accurate dosimetry; the MEDICIS facility for the production of rare radioisotopes; and a powerful computing grid for image treatment and storage.
As one of the speakers said, quoting the novelist William Gibson: “The future is here. It is just not evenly distributed yet.” This is the next challenge for the community of scientists who attended ICTR-PHE 2014 – to take all of these advances to the patients as quickly as possible.
Physics highlights
Even though the conference focused on translational research and medical applications of physics, it would have been impossible to ignore the discovery by the ATLAS and CMS experiments at the LHC of a Higgs boson – the particle linked to a mechanism that gives mass to many fundamental particles – and the subsequent award of the 2013 Nobel Prize in Physics to two of the theoreticians who proposed the mechanism. Fabiola Gianotti, former spokesperson of the ATLAS experiment at the LHC, opened the conference and captivated the audience with the tale of the many years of Higgs hunting by thousands of researchers across the world.
The role of physics and physicists was highlighted also by Ugo Amaldi in his public talk “Physics is beautiful and useful”. The father of the word “hadrontherapy” showed how, following the discovery of X-rays in 1895, fundamental physics, particle therapy and diagnostics became three intertwined yarns: the advances in one field have an impact on the other two. Amaldi concluded his much-appreciated talk by presenting an overview of possible future developments, including “Tulip” – a Turning Linac for Protontherapy – which is a new prototype that aims to supply protons with compact, less-expensive instrumentation.