Advanced radiation detectors in industry

 

The European Physical Society’s Technology and Innovation Group (EPS-TIG) was set up in 2011 to work at the boundary between basic and applied sciences, with annual workshops organized in collaboration with CERN as its main workhorse (CERN Courier April 2013 p31). The second workshop, organized in conjunction with the department of physics and astronomy and the “Fondazione Flaminia” of Bologna University, took place in Ravenna on 11–12 November 2013. The subject – advanced radiation detectors for industrial use – brought together experts involved in the research and development of advanced sensors and representatives from related spin-off companies.

The first session, on technology-transfer topics, opened with a keynote speech by Karsten Buse, director of the Fraunhofer Institute for Physical Measurement Technique (IPM), Freiburg. In the spirit of Joseph von Fraunhofer (1787–1826) – a researcher, inventor and entrepreneur – the Fraunhofer Gesellschaft promotes innovation and applied research that is of direct use to industry. Outlining the IPM’s mission and the specific competences and services it provides, Buse presented an impressive overview of technology projects that have been initiated, developed, improved or supported by the institute. He also emphasized the need to build up and secure intellectual property, and explained contract matters. The success stories include the MP3 audio-compression algorithm, white LEDs to replace conventional light bulbs, and all-solid-state widely tunable lasers. Buse concluded by observing that bridging the gap between academia and industry requires some attention, but is less difficult than often thought and also highly rewarding. A lively discussion followed among the audience of students, researchers and partners from industry.

The second talk focused on knowledge transfer (KT) from the perspective of CERN’s KT Group. First, Giovanni Anelli described the KT activities based on CERN’s technology portfolio and on people – that is, students and fellows. In the second part, Manjit Dosanjh presented the organization’s successful and continued transfer to medical applications of advanced technologies in the fields of accelerators, detectors and informatics technologies. Catalysing and facilitating collaborations between medical doctors, physicists and engineers, CERN plays an important role in “physics for health” projects at the European level via conferences and networks such as ENLIGHT, set up to bring medical doctors and physics researchers together (CERN Courier December 2012 p19).

Andrea Vacchi of INFN/Trieste reviewed the INFN’s KT activities. He emphasized that awareness of the value of the technology assets developed inside INFN is growing. In the past, technology transfer between INFN and industry happened mostly through the involvement of suppliers in the development of technologies. In future, INFN will take more proactive measures to encourage technology transfer between INFN research institutions and industry.

From lab to industry

The first afternoon was rounded off by Colin Latimer of the University of Belfast, a member of the EPS Executive Committee. He illustrated the varying timescales between invention and mass-application multi-billion-dollar markets with a number of example technologies, including optical fibres (1928), liquid-crystal displays (1936), magnetic-resonance imaging (MRI) scanners (1945) and lasers (1958), with high-temperature superconductors (1986) and graphene (2004) still waiting to make a major impact. Latimer went on to present results from the recent study commissioned by the EPS from the Centre for Economics and Business Research, which has shown the importance of physics to the European economy (EPS/Cebr 2013).

The second part of the workshop was devoted to sensors and innovation in instrumentation and industrial applications, starting with a series of talks that reviewed the latest developments. This was followed by presentations from industry on various sensor products, application markets and technological developments.

Erik Heijne, a pioneer of silicon and silicon-pixel detectors at CERN, started by discussing innovation in instrumentation through the use of microelectronics technology. Miniaturization to sub-micron silicon technologies allows many functions to be compacted into a small volume. This has led in turn to the integration of sensors and processing electronics in powerful devices, and has opened up new fields of applications (CERN Courier March 2014 p26). In high-energy particle physics, the new experiments at the LHC have been based on sophisticated chips that allow unprecedented event rates of up to 40 MHz. Some of the chips – or at least the underlying ideas – have found applications in materials analysis, medical imaging and other types of industrial equipment. The radiation imaging matrix, for example, based on silicon-pixel and integrated read-out chips, has many applications already.

Detector applications

Julia Jungmann of PSI emphasized the use of active pixel detectors for imaging in mass spectrometry in molecular pathology, in research done at the FOM Institute AMOLF in Amsterdam. The devices have promising features for fast, sensitive ion imaging: time and space information from the same detector, high spatial resolution, direct image acquisition and highly parallel detection. The technique, which is based on the family of Medipix/Timepix devices, provides detailed information on molecular identity and localization – vital, for example, in detecting the molecular basis of a pathology without the need to label bio-molecules. Applications include disease studies, drug-distribution studies and forensics. The wish list is now for chips with 100 ps time bins, a 1 ms measurement interval, multi-hit capabilities at the pixel level, higher read-out rates and high fluence tolerance.

In a similar vein, Alberto Del Guerra of the University of Pisa presented the technique of positron-emission tomography (PET) and its applications. Outlining the physics and technology of PET, he showed improved variants of PET systems and applications to molecular imaging, which allow the visual representation, characterization and quantification of biological processes at the cellular and subcellular levels within living organisms. Clinical hybrid PET/computerized-tomography (CT) systems for oncology and neurology, as well as human PET and micro-PET equipment combined with small-animal CT, are available from industry, and today there are also systems that combine PET with magnetic resonance imaging (MRI). Such systems are being used for monitoring in hadron therapy in Italy, at the 62 MeV proton cyclotron of the CATANA facility in Catania and at the proton and carbon synchrotron of the CNAO centre in Pavia. An optimized tri-modality imaging tool for schizophrenia is even being developed, combining PET with MRI and electroencephalography measurements. Del Guerra’s take-home message was that technology transfer in the medical field needs long-term investment – industry can withdraw halfway if a technology is not profitable (for example, Siemens in the case of proton therapy). In future, systems will be multimodal, with PET combined with other imaging techniques (CT, MRI, optical projection tomography) and targeted at specific organs such as the brain, breast and prostate.

The next topic related to recent developments in the silicon drift detector (SDD) and its applications. Chiara Guazzoni, of the Politecnico di Milano and INFN Milan, gave an excellent overview of SDDs, which were invented by Emilio Gatti and Pavel Rehak 30 years ago. These detectors are now widely used in X-ray spectroscopy and are commercially available. Conventional and non-conventional applications include the non-destructive analysis of cultural heritage and biomedical imaging based on X-ray fluorescence, proton-induced X-ray emission studies, gamma-ray imaging and spectroscopy, X-ray scatter imaging, etc. As Gatti and Rehak stated in their first patent, “additional objects and advantages of the invention will become apparent to those skilled in the art,” and Guazzoni hopes that the art will keep “drifting on” towards new horizons.

Moving on to presentations from industry and start-up companies, Jürgen Knobloch of KETEK GmbH in Munich presented new high-throughput, large-area SDDs, starting with a historical review of the work of Josef Kemmer, who in 1970 started to develop planar silicon technology for semiconductor detectors. Collaborating with Rehak and the Max-Planck Institute in Munich, Kemmer went on to produce the first SDDs with a homogeneous entrance window, with depleted field-effect transistor (DEPFET) and MOS-type DEPFET (DEPMOS) technologies. In 1989 he founded the start-up company KETEK, which is now the global commercial market leader in SDD technology. Knobloch presented the range of products from KETEK and concluded with a list of recommendations for better collaboration between research and industry. KETEK’s view on how science and industry can collaborate better includes: workshops of the kind organized by EPS-TIG; meetings between scientists and technology companies to set out practical needs and future requirements; involvement of technology-transfer offices to resolve intellectual-property issues; encouragement of industry to accept longer times for returns on investment; and the strengthening of synergies between basic research and industry R&D.

Knobloch’s colleague at KETEK, Werner Hartinger, then described new silicon photomultipliers (SiPMs) with high photon-detection efficiency, and listed the characteristics of a series of KETEK’s SiPM sensors, which also feature a huge gain (>10⁶) with low excess noise and a low temperature coefficient. KETEK has off-the-shelf SiPM devices and also customizes devices for CERN. The next steps will be continuous noise reduction (in both dark rate and cross-talk) by enhancing the KETEK “trench” technology, improvement of the pulse shape and timing properties by optimizing parasitic elements and read-out, and the production of chip-size packages and arrays at the package level.

New start-ups

PIXIRAD, a new X-ray imaging system based on chromatic photon-counting technology, was presented by Ronaldo Bellazzini of PIXIRAD Imaging Counters srl – a recently constituted INFN spin-off company. The detector can deliver extremely clear and highly detailed X-ray images for medical, biological, industrial and scientific applications in the energy range 1–100 keV. Photon counting, colour mode and high spatial resolution lead to an optimal ratio of image quality to absorbed dose. Modules with units of 1, 2, 4 and 8 tiles have been built with almost zero dead space between the blocks. A complete X-ray camera based on the PIXIRAD-1 single-module assembly is available for customers in scientific and industrial markets for X-ray diffraction, micro-CT, etc. A dedicated machine to perform X-ray slot-scanning imaging has been designed and built and is currently under test. This system, which uses the PIXIRAD-8 module and is able to produce large-area images with fine position resolution, has been designed for digital mammography, which is one of the most demanding X-ray imaging applications.

CIVIDEC Instrumentation – another start-up company – was founded in 2009 by Erich Griesmayer. He presented several examples of applications of its products, which are based on diamond-detector technology. They have found use at the LHC and other accelerator beamlines as beam-loss and beam-position monitors, for timing measurements, high-radiation-level measurements and neutron time-of-flight studies, and as low-temperature detectors in superconducting quadrupoles. The company provides turn-key solutions that connect via the internet, supplying clients worldwide.

Nicola Tartoni, head of the detector group at the Diamond Light Source, outlined the layout of the facility and its diversified programmes. He presented an overview of the detector development and beamlines of this outstanding user facility in partnership with industry, with diverse R&D projects of increasing complexity.

Last, Carlos Granja, of the Institute of Experimental and Applied Physics (IEAP) at the Czech Technical University (CTU) in Prague, described the research carried out with the European Space Agency (ESA), demonstrating impressive progress in the detection and tracking of individual radiation quanta in space. This work has used the Timepix hybrid semiconductor pixel detector developed by the Medipix collaboration at CERN. The Timepix-based space-qualified payload, produced by IEAP CTU in collaboration with the CSRC company of the Czech Republic, has been operating continuously on board ESA’s Proba-V satellite, in low-Earth orbit at 820 km altitude, since its launch in May 2013. Highly miniaturized devices produced by IEAP CTU are also flying on board the International Space Station, for the University of Houston and NASA, for high-sensitivity quantum dosimetry of the space-station crew.

In other work, IEAP CTU has developed a micro-tracker particle telescope in which particle tracking and directional sensitivity are enhanced by the stacked layers of the Timepix device. For improved and wide-application radiation imaging, edgeless Timepix sensors developed at VTT and Advacam in Finland, with advanced read-out instrumentation and micrometre-precision tiling technology (available at IEAP CTU and the WIDEPIX spin-off company, of the Czech Republic), enable large sensitive areas up to 14 cm square to be covered by up to 100 Timepix sensors. This development allows the extension of high-resolution X-ray and neutron imaging at the micrometre level to a range of scientific and industrial applications.

• For more about the workshop, visit www.emrg.it/TIG_Workshop_2013/program.php?language=en. For the presentations, see http://indico.cern.ch/event/284070/.

The 1980s: spurring collaboration

The 1980s were characterized by two outstanding achievements that were to influence the long-term future of CERN. First came the discovery of the W and Z particles, the carriers of the weak force, produced in proton–antiproton collisions at the Super Proton Synchrotron (SPS) and detected by the UA1 and UA2 experiments. These were the first of the now-typical collider experiments, covering the full solid angle and requiring large groups of collaborators from many countries. The production of a sufficient number of antiprotons and their handling in the SPS underlay these successes, which were crowned by the Nobel Prize awarded to Carlo Rubbia and Simon van der Meer in 1984.

Then came the construction and commissioning of the Large Electron–Positron (LEP) collider. With its 27 km tunnel, it is still the largest collider of this kind ever built. Four experiments were approved – ALEPH, DELPHI, L3 and OPAL – representing again a new step in international co-operation. More than 2000 physicists and engineers from 12 member states and 22 non-member states participated in the experiments. Moreover, most of the funding of several hundred million Swiss francs had to come from outside the organization. CERN contributed only about 10% and had practically no reserves in case of financial overruns. Therefore the collaborations had to achieve a certain independence, and had to learn to accept common responsibilities. A new “sociology” for international scientific co-operation was born, which later became a model for the LHC experiments.

A result of the worldwide attraction of LEP was that from 1987 onwards, more US physicists worked at CERN than particle physicists from CERN member states at US laboratories. In Europe, two more states joined CERN: Spain, which had left CERN in 1968, came back in 1983, and Portugal joined in 1985. However, negotiations at the time with Israel and Turkey failed, for different reasons.

But the 1980s also saw “anti-growth”. Previously, CERN had received special allocations to the budget for each new project, leading to a peak around 1974 and declining afterwards. When LEP was proposed in 1981, the budget was 629 million Swiss francs. After long and painful discussions, Council approved a constant yearly budget of 617 million Swiss francs for the construction of LEP, under the condition that any increase – including automatic compensation for inflation – across the construction period of eight years was excluded. The unavoidable consequence of these thorny conditions was the termination of many non-LEP programmes (e.g. the Intersecting Storage Rings and the bubble-chamber programme) and a “stripped down” LEP project. The circumference of the tunnel had to be reduced, but was maintained at 27 km in view of a possible proton–proton collider in the same tunnel – which indeed proved to be a valuable asset.

A precondition to building LEP with decreasing resources was the unification of CERN. CERN II had been established in 1971 for construction of the SPS, with its own director-general, staff and management. From 1981, CERN was united under one director-general, but staff tended to adhere to their old groups, showing solidarity with their previous superiors and colleagues. However, for the construction of LEP, all of CERN’s resources had to be mobilized, and about 1000 staff were transferred to new assignments.

Another element of “anti-growth” had long-term consequences. Council was convinced that the scientific programme was first class, but had doubts about the efficiency of management. An evaluation committee was established to assess the human and material resources, with a view to reducing the CERN budget. In the end, the committee declined to consider a lower material budget, because this would undoubtedly jeopardize the excellent scientific record of CERN. Instead, it proposed a reduction of staff from about 3500 to 2500 through an early-retirement programme, and during the construction of the LHC this was lowered further to 2000. However, to cope with the increasing tasks and the rising number of outside users, many activities had to be outsourced, so no considerable reduction of the budget was achieved.

Yet despite these limiting conditions, LEP was built within the foreseen time and budget, thanks to the motivation and ingenuity of the CERN staff. First collisions were observed on 13 August 1989.

The theme of CERN’s 60th anniversary is “science for peace” – from its foundation, CERN has had the task of promoting not only science but also peace. This was emphasized at a ceremony for the 30th anniversary in 1984 by the American physicist and co-founder of CERN, Isidor Rabi: “I hope that the scientists of CERN will remember…[they are] as guardians of this flame of European unity so that Europe can help preserve the peace of the world.” Indeed, during the 1980s CERN continued to fulfil this obligation, with many examples such as co-operation with East European countries (in particular via JINR, Dubna) and with countries from the Far East (physicists from Mainland China and Taiwan were allowed to work together in the same experiment, L3, at LEP). Later, CERN became the cradle of SESAME, an international laboratory in the Middle East.

Unavoidably, CERN’s growth into a world laboratory is changing how it functions at all levels. However, we can be confident that it will perform its tasks in the future with the same enthusiasm, dedication and efficiency as in the past.

The Theory of the Quantum World: Proceedings of the 25th Solvay Conference on Physics

By David Gross, Marc Henneaux and Alexander Sevrin (eds.)
World Scientific
Hardback: £58
Paperback: £32
E-book: £24

Since 1911, the Solvay Conferences have helped shape modern physics. The 25th edition in October 2011, chaired by David Gross, continued this tradition, while also celebrating the conferences’ first centennial. The development and applications of quantum mechanics have been the main threads throughout the series, and the 25th Solvay Conference gathered leading figures working on a variety of problems in which quantum-mechanical effects play a central role.

In his opening address, Gross emphasized the success of quantum mechanics: “It works, it makes sense, and it is hard to modify.” In the century since the first Solvay Conference, the worry expressed by H A Lorentz in his opening address in 1911 – “we have reached an impasse; the old theories have been shown to be powerless to pierce the darkness surrounding us on all sides” – has been resolved. Physics is not in crisis today, but as Gross says there is “confusion at the frontiers of knowledge”. The 25th conference therefore addressed some of the most pressing open questions in the field of physics. As Gross admits, the participants were “unlikely to come to a resolution during this meeting…[but] in any case it should be lots of fun”.

The proceedings contain the rapporteur talks and, in the Solvay tradition, they also include the prepared comments to these talks. The discussions among the participants – some involving dramatically divergent points of view – have been carefully edited and are reproduced in full.

The reports cover the sessions: “History and reflections” (John L Heilbron and Murray Gell-Mann); “Foundations of quantum mechanics and quantum computation” (Anthony Leggett and John Preskill); “Control of quantum systems” (Ignacio Cirac and Steven Girvin); “Quantum condensed matter” (Subir Sachdev); “Particles and fields” (Frank Wilczek); and “Quantum gravity and string theory” (Juan Maldacena and Alan Guth). The proceedings end – as did the conference – with a general discussion attempting to arrive at a synthesis, leaving the reader to judge whether it fulfilled Gross’s prediction and was indeed “lots of fun”.

Mathematics of Quantization and Quantum Fields

By Jan Dereziński and Christian Gérard
Cambridge University Press
Hardback: £90 $140
Also available as an e-book

Unifying a range of topics currently scattered throughout the literature, this book offers a unique review of mathematical aspects of quantization and quantum field theory. The authors present both basic and more advanced topics in a mathematically consistent way, focusing on canonical commutation and anti-commutation relations. They begin with a discussion of the mathematical structures underlying free bosonic or fermionic fields, such as tensors, algebras, Fock spaces, and CCR and CAR representations. Applications of these topics to physical problems are discussed in later chapters.

Three-Particle Physics and Dispersion Relation Theory

By A V Anisovich, V V Anisovich, M A Matveev, V A Nikonov, J Nyiri and A V Sarantsev
World Scientific
Hardback: £65
E-book: £49

The necessity of describing three-nucleon and three-quark systems has led to continuing interest in the three-particle problem. The question of how to include relativistic effects arose with the treatment of decay amplitudes in the dispersion technique. The relativistic dispersion description of amplitudes always takes into account processes that are connected to the reaction in question by the unitarity condition or by virtual transitions; in the case of three-particle processes these are, as a rule, reactions in which other many-particle states and resonances are produced. The description of these interconnected reactions, and ways of handling them, is the main subject of the book.

Science, Religion, and the Search for Extraterrestrial Intelligence

By David Wilkinson
Oxford University Press
Hardback: £25
Also available as an e-book

With doctorates in both astrophysics and theology, David Wilkinson is well qualified to discuss the subject matter of this book. He provides a captivating narrative on the scientific basis for the search for extraterrestrial intelligence and the religious implications of finding it. However, the academic nature of the writing might hinder the casual reader, with nearly every paragraph citing at least one reference.

Scientific and religious speculation on the possibility of life elsewhere in the universe is age-old. Wilkinson charts its history from the era of Plato and Democritus, when the existence of worlds besides our own was up for debate, to the latest data from telescopes and observatories, which paint vivid pictures of the many new worlds discovered around alien suns.

Readers familiar with astrophysics and evolutionary biology might find themselves skipping sections of the book that go into the specific conditions that need to be met for Earth-like life to evolve and attain intelligence. Wilkinson, however, is able to tie these varied threads together, presenting both the pessimism and optimism towards the presence of extraterrestrial life exhibited by scientists from different fields.

Despite referring to religion in the title, Wilkinson states early on that his work mainly discusses the relationship of Christianity and SETI. In this regard, the book provided me with much insight into Christian doctrine and its many – often contradictory – views on the universe. For example, despite the shaking of the geocentric perspective with the so-called Copernican Revolution, some Christian scholars from the era maintained that the special relationship of humans with God dictated that only Earth could harbour God-fearing life forms. Earth, therefore, retained its central position in the universe in a symbolic if not a literal sense. Other views held that nothing could be beyond the ability of an omnipotent, omnipresent God, who to showcase his glory might well have created other worlds with their own unique creatures.

After covering everything from science fiction to Christian creation beliefs, Wilkinson concludes with his personal views on the value of involving theology in searches for alien life. I leave you to draw your own conclusions about this! Overall, the book is a fascinating read and is recommended for those pondering the place of humanity in our vast universe.

Einstein’s Physics: Atoms, Quanta, and Relativity – Derived, Explained, and Appraised

By Ta-Pei Cheng
Oxford University Press
Hardback: £29.99
Also available as an e-book

Being familiar with the work of Ta-Pei Cheng, I started this book with considerable expectations – and I enjoyed the first two sections. I found many delightful discussions of topics in the physics that came after Albert Einstein, as well as an instructive discussion on his contributions to quantum theory, where the author shares Einstein’s reservations about quantum mechanics. However, the remainder of the text dedicated to relativity and related disciplines has problems. The two pivotal issues of special relativity, the aether and the proper time, provide examples of what I mean.

On p140, the author writes “…keep firmly in mind that Einstein was writing for a community of physicists who were deeply inculcated in the aether theoretical framework”, and continues “(Einstein, 1905) was precisely advocating that the whole concept of aether should be abolished”. Of course, Einstein was himself a member of the community “inculcated in the aether” and, indeed, aether was central in his contemplation of the form and meaning of physical laws. His position was cemented by the publication in 1920 of a public address on “Aether and the Theory of Relativity” and its final paragraph “…there exists an aether. According to the general theory of relativity space without aether is unthinkable; for in such space there not only would be no propagation of light, but also no possibility of existence for standards of space and time…”. This view superseded the one expressed in 1905, yet that is where the discussion in the book ends.

The last paragraph on p141 states that “…the key idea of special relativity is the new conception of time.” Einstein is generally credited with the pivotal discovery of “body time”, or in Hermann Minkowski’s terminology, a body’s “proper time”. The central element of special relativity is the understanding of the invariant proper time. Bits and pieces of “time” appear in sections 9–12 of the book, but the term “proper time” is mentioned only incidentally. Then on p152 I read “A moving clock appears to run slow.” This is repeated on p191, with the addition “appears to this observer”. However, the word “appears” cannot be part of an unambiguous explanation. A student of Einstein’s physics would say “A clock attached to a material body will measure a proper-time lifespan independent of the state of inertial motion of the body. This proper time is the same as laboratory time only for bodies that remain always at rest in the laboratory.” That said, I must add that I have never heard of doubts about the reality of time dilation, which is verified when unstable particles are observed.
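
For concreteness, the relation the reviewer has in mind can be written down explicitly; this is the standard textbook statement, added here for clarity rather than quoted from the book under review:

```latex
% Proper time accumulated by a clock moving with speed v(t)
% relative to an inertial laboratory frame:
\Delta\tau \;=\; \int \sqrt{1 - \frac{v^{2}(t)}{c^{2}}}\;\mathrm{d}t
           \;=\; \int \frac{\mathrm{d}t}{\gamma(t)},
\qquad
\gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}}
```

For an unstable particle of proper lifetime τ₀ moving at constant speed, the laboratory lifetime is γτ₀ – the dilation that, as noted above, is verified whenever fast unstable particles are observed.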

Once the book progresses into a discussion of Riemannian geometry and, ultimately, of general relativity, gauge theories and higher-dimensional Kaluza–Klein unification, it works through modern topics of only marginal connection to Einstein’s physics. However, I am stunned by several comments about Einstein. On p223, the author explains how “inept” Einstein’s long proof of general relativity was, and instead of praise for Einstein’s persistence, which ultimately led him to the right formulation of general relativity, we read about “erroneous detours”. On p293, the section on “Einstein and mathematics” concludes with a paragraph that explains the author’s view as to why “…Einstein had not made more advances…”. Finally, near the end, the author writes on p327 that Einstein “could possibly have made more progress had he been as great a mathematician as he was a great physicist”. This is a stinging criticism of someone who did so much, for things he did not do.

The book presents historical context and dates, but the dates of Einstein’s birth and death are found only in the index entry “Einstein”, and there is little more about him to be found in the text. A listing of 30 cited papers appears in appendix B1 and includes only three papers published after 1918. The book addresses mainly the academic work of Einstein’s first 15 years, 1902–1917, but I have read masterful papers that he wrote during the following 35 years, such as “Solution of the field of a star in an expanding universe” (Einstein and Straus 1945 Rev. Mod. Phys. 17 120 and 1946 Rev. Mod. Phys. 18 148).

I would strongly discourage the target group – undergraduate students and their lecturers – from using this book, because in the part on special relativity the harm far exceeds the good. To experts, I recommend Einstein’s original papers.

ASACUSA produces first beam of antihydrogen atoms for hyperfine study

A beam of antihydrogen atoms has for the first time been successfully produced by an experiment at CERN’s Antiproton Decelerator (AD). The ASACUSA collaboration reports the unambiguous detection of antihydrogen atoms 2.7 m downstream from their production, where the perturbing influence of the magnetic fields used to produce the antiatoms is negligibly small. This result is a significant step towards precise hyperfine spectroscopy of antihydrogen atoms.

High-precision microwave spectroscopy of ground-state hyperfine transitions in antihydrogen atoms is a main focus of the Japanese-European ASACUSA collaboration. The research aims to investigate differences between matter and antimatter, testing CPT symmetry (the combination of charge conjugation, C, parity, P, and time reversal, T) by comparing the spectra of antihydrogen with those of hydrogen, one of the most precisely investigated and best understood systems in modern physics.

One of the key challenges in studying antiatoms is to keep them away from ordinary matter. To do so, other collaborations take advantage of antihydrogen’s magnetic properties and use strong, non-uniform magnetic fields to trap the antiatoms long enough to study them. However, the strong magnetic-field gradients degrade the spectroscopic properties of the antihydrogen. To allow for clean, high-resolution spectroscopy, the ASACUSA collaboration has developed an innovative set-up to transfer antihydrogen atoms to a region where they can be studied in flight, far from the strong magnetic field regions.

In ASACUSA, the antihydrogen atoms are formed by loading antiprotons and positrons into the so-called cusp trap, which combines the magnetic field of a pair of superconducting anti-Helmholtz coils (i.e., coils with antiparallel excitation currents) with the electrostatic potential of an assembly of multi-ring electrodes (CERN Courier March 2011 p17). The magnetic-field gradient allows the flow of spin-polarized antihydrogen atoms along the axis of the cusp trap. Downstream there is a spectrometer consisting of a microwave cavity to induce spin-flips in the antiatoms, a superconducting sextupole magnet to focus the neutral beam and an antihydrogen detector. (The microwave cavity was not installed in the 2012 experiment.)

The detector, located 2.7 m from the antihydrogen-production region, consists of a single crystal of bismuth germanium oxide (BGO) surrounded by five plates of plastic scintillator. Antihydrogen atoms annihilating in the crystal emit three charged pions on average, so the required signal is a coincidence between the crystal and at least two of the plastic scintillators. Simulations show that this requirement reduces the background, from antiprotons annihilating upstream and from cosmic rays, by three orders of magnitude.
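
As an illustration of the selection logic just described – a minimal sketch with invented event structure and thresholds, not the collaboration’s actual analysis code:

```python
# Minimal sketch of the BGO + plastic-scintillator coincidence selection.
# The event fields and the energy threshold are illustrative assumptions.
from dataclasses import dataclass
from typing import List

@dataclass
class Event:
    bgo_energy_mev: float          # energy deposited in the BGO crystal
    scintillator_hits: List[bool]  # hit flags for the five plastic plates

def passes_coincidence(event: Event, bgo_threshold_mev: float = 40.0) -> bool:
    """Require a BGO hit in coincidence with at least two plastic scintillators."""
    return (event.bgo_energy_mev > bgo_threshold_mev
            and sum(event.scintillator_hits) >= 2)

# Example: a candidate annihilation firing the crystal and three of the plates
candidate = Event(bgo_energy_mev=120.0,
                  scintillator_hits=[True, False, True, True, False])
print(passes_coincidence(candidate))  # True
```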

The ASACUSA researchers investigate the principal quantum number, n, of the antihydrogen atoms that reach the detector, because their goal is to perform hyperfine spectroscopy on the ground state, n = 1. For these measurements, field-ionization electrodes were positioned in front of the BGO, so that only antihydrogen atoms with n < 43 or n < 29 reached the detector, depending on the average electric field. The analysis indicates that 80 antihydrogen atoms were unambiguously detected with n < 43, with a significant number having n < 29.
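
The dependence of such cuts on the applied field can be understood from the classical over-barrier estimate for field ionization of hydrogen-like atoms – a standard estimate, given here for orientation and not taken from the ASACUSA publication:

```latex
% Classical over-barrier threshold field for ionizing a state of
% principal quantum number n, with F_0 the atomic unit of field
% strength (about 5.14 x 10^9 V/cm):
F_{\mathrm{ion}}(n) \;\simeq\; \frac{F_{0}}{16\,n^{4}}
```

Because the threshold falls steeply as 1/n⁴, a given electric field strips only the most loosely bound, high-n atoms; raising the field therefore lowers the maximum n that can survive to the detector.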

This analysis was based on data collected in 2012, before the accelerator complex at CERN entered its current long shutdown. Since then, the collaboration has been preparing for the restart of the experiment at the AD in October this year. A new cusp magnet is under construction, which will provide a much stronger focusing force on the spin-polarized antihydrogen beam. A cylindrical high-resolution tracker and a new antihydrogen-beam detector are also under development. In addition, an upgraded positron accumulator will provide an order of magnitude more positrons. The team ultimately needs a beam of antihydrogen in its ground state (n = 1), so the updated experiment will employ an ionizer with higher fields, to extract only antihydrogen atoms that are effectively in the ground state.

Jets give clues to the geometry of proton–nucleus collisions

Studies of the centrality-dependence of jet production in proton–lead collisions in ATLAS at the LHC are yielding surprising results.

In both proton–ion and ion–ion collisions, many interesting phenomena are influenced by the initial geometry of the collision system. In proton–nucleus collisions, protons that strike the centre of the nucleus (central collisions) see a thicker nuclear target than those that strike the edge (peripheral collisions), and are therefore more likely to undergo a hard scattering. Traditionally, the geometry of the collisions, or “centrality”, is characterized by a measure of the event activity. For this measurement in ATLAS, the centrality is defined by the total transverse energy in the forward calorimeter in the direction of the lead beam. A model that describes the expected geometric enhancement of hard-scattering rates is used to relate this experimental measure to a factor TA (Miller et al. 2007).

The quantity RCP is defined as the ratio of the per-event jet yields in different centrality bins, divided by the ratio of the corresponding TA factors, which account for the expected geometric enhancement. The left panels of the figure show RCP values as a function of the jet transverse momentum, pT, for different ranges in jet rapidity, y*, in the centre-of-mass frame. Negative y* indicates the proton-going direction. The normalized jet ratio, RCP, is suppressed at high pT and large negative y* compared with the expectation from known nuclear effects, which would correspond to a value near unity in RCP. The suppression of jets is strongest in the most central collisions (0–10%) at the highest pT values. Additional studies, independent of the centrality definition, indicate that final-state energy loss like that observed in ion–ion collisions (jet quenching) is unlikely to be the main source of the effect.
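
In symbols, using assumed but conventional notation for the per-event jet yield in a given centrality class (cent) and in peripheral collisions (periph), the definition just given reads:

```latex
R_{\mathrm{CP}} \;=\;
  \frac{\dfrac{1}{T_{A}^{\mathrm{cent}}}\,
        \dfrac{1}{N_{\mathrm{evt}}^{\mathrm{cent}}}\,
        \dfrac{\mathrm{d}N_{\mathrm{jet}}^{\mathrm{cent}}}{\mathrm{d}p_{\mathrm{T}}}}
       {\dfrac{1}{T_{A}^{\mathrm{periph}}}\,
        \dfrac{1}{N_{\mathrm{evt}}^{\mathrm{periph}}}\,
        \dfrac{\mathrm{d}N_{\mathrm{jet}}^{\mathrm{periph}}}{\mathrm{d}p_{\mathrm{T}}}}
```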

These results suggest either a correlation between hard-jet production and soft-particle production that breaks the traditional geometric paradigm, or that in proton–lead collisions the energy in the forward calorimeter is not a good measure of the centrality of the collision in events with jets.

The right panel of the figure shows the jet RCP as a function of pT cosh(y*), which corresponds closely to the jet’s energy. When recast in terms of this quantity, the RCP values from different y* intervals all fall along the same line. Whether this is a coincidence or related to the underlying dynamics is not yet known.
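
The correspondence is simple kinematics: for a jet whose mass is negligible compared with its momentum, the transverse mass equals pT, so the energy follows directly:

```latex
% Standard relation between energy, transverse momentum and rapidity,
% in the massless limit (m_jet << p_T, so m_T ~ p_T):
E \;=\; m_{\mathrm{T}} \cosh y \;\approx\; p_{\mathrm{T}} \cosh y^{*}
```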

These observations are among the most striking preliminary ATLAS results from the 2013 proton–lead run of the LHC. Further measurements are needed to uncover the mechanism underlying the observed soft–hard correlations.

CMS observes new single-top production mode

The top quark remains, nearly 20 years after its discovery by the experiments at Fermilab’s Tevatron, the heaviest particle known. Its production and decays continue to be the subjects of extensive studies, both at the Tevatron and at the LHC. While top quarks are predominantly produced in pairs through the strong interaction, the production of single top quarks is also possible by virtue of the electroweak interaction, albeit with a much smaller production cross-section. This latter production mechanism provides a unique window into the dynamics of the top quark.

Three different production channels for single top quarks can be distinguished: the t-channel, the s-channel and the W-associated channel (figure 1). After many years of intensive searches, the Tevatron experiment collaborations first reported the observation of singly produced top quarks in 2009. These initial searches were optimized for the t- and s-channel production modes combined. Later, the t-channel mode was individually established by experiments at the Tevatron and the LHC, while evidence for s-channel production was reported only recently by the D0 collaboration at the Tevatron, in July 2013.

Meanwhile, the third production mechanism, W-associated production, remained out of the Tevatron’s reach because of its small cross-section and the lack of a sufficiently distinctive signature. At the LHC, however, the production rate for top quarks is much higher. This allowed ATLAS and CMS to report evidence of W-associated production already with the data collected at a centre-of-mass energy of 7 TeV in 2011, although these measurements did not reach the 5σ gold standard for the solid observation of a new signal. Now, thanks to the data collected in 2012 with the LHC operating at a centre-of-mass energy of 8 TeV, the CMS collaboration has been able to complete the experimental picture of the family of single top-quark production mechanisms.

The main background to W-associated production comes from pair-produced top quarks, where the decay of one of the top quarks appears W-like because the b quark from its decay fails the b-tagging requirements. No single defining feature separates the two processes sharply, so several kinematic properties have been combined into a single multivariate discriminant by a boosted decision tree – a machine-learning technique that is particularly suited to separating tiny signals from overwhelming backgrounds. Figure 2 shows the discriminant from the boosted decision tree.
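
As an illustration of the technique – not the CMS analysis itself; the toy data and feature choices below are invented for the sketch – a boosted decision tree can be trained on a handful of kinematic variables to yield a single per-event discriminant:

```python
# Illustrative boosted-decision-tree discriminant in the spirit of the text;
# the toy features (e.g. jet pT, lepton pT, missing ET, system mass) and all
# numbers are invented for the sketch.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(seed=1)
n = 20000

# Toy kinematic distributions for signal (tW-like) and background (ttbar-like).
signal = rng.normal(loc=[60, 40, 50, 180], scale=[20, 15, 20, 30], size=(n, 4))
background = rng.normal(loc=[80, 35, 45, 250], scale=[30, 15, 25, 60], size=(n, 4))

X = np.vstack([signal, background])
y = np.concatenate([np.ones(n), np.zeros(n)])  # 1 = signal, 0 = background
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

# A boosted decision tree: an ensemble of shallow trees, each trained to
# correct the residual mistakes of its predecessors.
bdt = GradientBoostingClassifier(n_estimators=200, max_depth=3)
bdt.fit(X_train, y_train)

# The per-event discriminant whose distribution would then be fitted.
discriminant = bdt.predict_proba(X_test)[:, 1]
print(f"test accuracy: {bdt.score(X_test, y_test):.3f}")
```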

The amount of signal in the selected sample is inferred by a fit to the discriminant distribution. To constrain tightly the amount of top-quark pair background with the data, two complementary samples with one additional jet, separated into cases with one or both jets b-tagged, are also included in the fit. The relative population of the samples with one or two b-tagged jets – both of which are almost pure in top-quark pair events – proves to be a powerful handle to reduce the uncertainty on the b-tagging efficiency, and therefore improve the precision of the analysis.

The excess of events with respect to the background-only hypothesis is quantified at a significance of 6.1σ (with 5.4±1.4σ expected from simulation), and the cross-section is found to be 23.4±5.4 pb, in agreement with the Standard Model prediction. Two cross-check analyses, less sensitive but relying on fewer modelling assumptions, confirm the result, therefore further supporting the first-time observation of this new single-top production mode.

This result opens the door to future searches for anomalous interactions of the top quark with the W boson, in a production mode that brings complementary information to studies that have so far been performed only with selections optimized for the more abundant top-pair and single top-quark t-channel events.
