
Towards a Higgs boson: first steps in an incredible journey

Letters of intent

The idea that the tunnel for the future Large Electron–Positron (LEP) collider should be able to house, at some time even further in the future, a Large Hadron Collider (LHC) was already in the air in the late 1970s. Moreover – thankfully – those leading CERN at the time had the vision to plan for a tunnel with a large enough diameter to allow the eventual installation of such an accelerator. In the broader community, however, enthusiasm for an LHC surfaced for the first time in 1984, promoted in part by members of CERN's successful proton–antiproton collider experiments, following their discovery of the W and Z bosons the previous year. A workshop in Lausanne on the "Large Hadron Collider in the LEP Tunnel", organized jointly by the European Committee for Future Accelerators (ECFA) and CERN, brought together working groups comprising machine experts, theorists and experimentalists.

With the realization of the great physics potential of an LHC, several motivating workshops and conferences followed, at which the formidable experimental challenges started to appear manageable, provided that enough R&D work on detectors could be carried out. Highlights of these "LHC experiment preliminaries" were the 1987 Workshop in La Thuile for the so-called "Rubbia Long-Range Planning Committee" and the large Aachen ECFA LHC Workshop in 1990. Finally, in March 1992 the famous conference "Towards the LHC Experimental Programme" took place in Evian-les-Bains, where several proto-collaborations presented their designs in "expressions of interest". Moreover, CERN's LHC Detector R&D Committee (DRDC), which reviewed and steered R&D collaborations, greatly stimulated innovative developments in detector technology from the early 1990s.

Designs for a Higgs

The detection of the Standard Model Higgs boson played a particularly important role in the design of the general-purpose experiments. In the low-mass region (114 < m_H < 150 GeV), the two channels considered particularly suited for unambiguous discovery were the decay to two photons and the decay to two Z bosons, where one or both of the Z bosons could be virtual. Because the natural width of the putative Higgs boson is < 10 MeV, the width of any observed peak would be entirely dominated by the instrumental mass resolution. This meant that, in designing the general-purpose detectors, considerable care was devoted to the choice of magnetic-field strength, to the precision-tracking systems and to high-resolution electromagnetic calorimeters. The high-mass region and signatures from supersymmetry drove the need for good resolution for jets and missing transverse energy, as well as for almost full 4π calorimetry coverage.
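
To see why the instrumental resolution, rather than the natural width, sets the width of an observable two-photon peak, the following minimal Python sketch reconstructs the diphoton invariant mass and propagates a toy calorimeter energy resolution to it. The resolution parameters and function names are illustrative assumptions, not the design figures of ATLAS or CMS.

```python
import math

def diphoton_mass(e1, e2, theta):
    """Invariant mass (GeV) of two photons with energies e1, e2 (GeV)
    separated by opening angle theta (rad): m = sqrt(2*e1*e2*(1 - cos theta))."""
    return math.sqrt(2.0 * e1 * e2 * (1.0 - math.cos(theta)))

def energy_resolution(e, stochastic=0.10, constant=0.007):
    """Toy relative calorimeter resolution: stochastic/sqrt(E) added in
    quadrature with a constant term; both values are placeholders."""
    return math.hypot(stochastic / math.sqrt(e), constant)

# Two back-to-back 62.5 GeV photons reconstruct to m = 125 GeV,
# inside the low-mass window discussed above.
e1, e2, theta = 62.5, 62.5, math.pi
m = diphoton_mass(e1, e2, theta)

# Neglecting the angular term, the relative mass resolution is roughly half
# the quadrature sum of the two relative energy resolutions.
sigma_m = 0.5 * math.hypot(energy_resolution(e1), energy_resolution(e2)) * m
print(f"m_gg = {m:.1f} GeV, sigma_m ~ {sigma_m:.2f} GeV")  # ~1.3 GeV >> 10 MeV natural width
```

Even with a per-mille-level constant term, the reconstructed peak comes out of order a GeV wide – vastly broader than the natural width – which is why calorimeter and tracker performance drove the detector designs.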

ATLAS detector

The choice of the field configuration determined the overall design. It was well understood that to stand the best chance of making discoveries at the new “magic” energy scale of the LHC – and in the harsh conditions generated by about a billion pairs of protons interacting every second – would require the invention of new technologies while at the same time pushing existing ones to their limits. In fact, a prevalent saying was: “We think we know how to build a high-energy, high-luminosity hadron collider – but we don’t have the technology to build a detector for it.” That the general-purpose experiments have worked so marvellously well since the start-up of the LHC is a testament to the difficult technology choices made by the conceivers and the critical decisions made during the construction of these experiments. It is noteworthy that the very same elements mentioned above were crucial in the recent discovery of a Higgs boson.

At the Aachen meeting in 1990, much discussion took place on which detector technologies and field configurations to deploy. At the Evian meeting two years later, four experiment designs were presented: two using toroids and two using high-field solenoids. In the aftermath of this meeting, lively discussions took place in the community on how to continue and possibly join forces. The time remaining was short because the newly formed peer-review committee, the LHC Committee (LHCC), had set a deadline for the submission of the letters of intent (LoI) of 1 October 1992.

The designs based on the toroidal configurations merged to form the ATLAS experiment, deploying a superconducting air-core toroid for the measurement of muons, supplemented by a superconducting 2 T solenoid to provide the magnetic field for inner tracking and by a liquid-argon/lead electromagnetic calorimeter with a novel “accordion” geometry. The two solenoid-based concepts were eventually submitted separately. Although not spelt out explicitly, it was clear to everyone that resources would permit only two general-purpose experiments. It took seven rounds of intense encounters between the experiment teams and the LHCC referees before the committee decided at its 7th meeting on 8–9 June 1993 “to recommend provisionally that ATLAS and CMS should proceed to technical proposals”, with agreed milestones for further review in November 1993. The CMS design centred on a single large-bore, long, high-field superconducting solenoid, together with powerful microstrip-based inner tracking and an electromagnetic calorimeter of novel scintillating crystals.

So, the two general-purpose experiments were launched but the teams could not have foreseen the enormous technical, financial, industrial and human challenges that lay ahead. For the technical proposals, many difficult technology choices now had to be made “for real” for all of the detector components, whereas the LoI had just presented options for many items. This meant, for example, that several large R&D collaborations sometimes had to give up on excellent developments in instrumentation that had been carried out over many years and to find their new place working on the technologies that the experimental collaborations considered best able to deliver the physics. Costs and resources were a constant struggle: they were reviewed over a period of many years by an expert costs-review committee (called CORE) of the LHCC. It was not easy for many bright physicists to accept that the chosen technology also had to remain affordable.

CMS detector

The years just after the LoI were also the time when the two collaborations grew most rapidly in terms of people and institutes. Finding new collaborators was a high priority on the "to do" list of the spokespeople, who became real frequent-flyers, conducting global "grand tours". These included many trips to far-flung, non-European countries to motivate and invite participation and contributions to the experiments, in parallel (and sometimes even in competition) with CERN's effort to obtain non-member-state contributions to enable the timely construction of the accelerator. It was during this period that the currently healthy mix of wealthy and less-wealthy countries was established in the two collaborations, clearly placing a value on not only material contributions but also intellectual ones. One important event was the integration of a strong US community after the cancellation of the Superconducting Super Collider in 1993, which had a notable impact on both ATLAS and CMS in terms of the final capabilities of these experiments.

The submissions of the technical proposals followed in December 1994 and these were approved in 1996. The formal approval for construction was given on 1 July 1997 by the then director-general, Chris Llewellyn Smith, based on the recommendations of the Research Board and the LHCC (by then at meeting number 27), after the first of a long series of Technical Design Reports and the imposition of a material cost ceiling of SwFr475 million. These were also the years when the formal framework was set up by CERN and all of the funding agencies, first in interim form and finally via a detailed Construction Memorandum of Understanding, agreed in the new, biannual Resources Review Boards.

Training young physicists: a 20-year success story

European Schools

The original CERN Schools of High-Energy Physics were established in the early 1960s at the initiative of Owen Lock, who played a leading role in their development over the next three decades. The first schools in 1962 and 1963 were one-week events organized at St Cergue in Switzerland, near CERN. However, from 1964 onwards the annual events – by then lasting two weeks – took place in other countries, generally in member states of CERN.

Starting in 1970, every second school was organized jointly with the Joint Institute for Nuclear Research (JINR), CERN’s sister organization in the Soviet Union. This collaboration between East and West, even during the Cold War, exemplified how a common interest in science could bring together people from different nations working in harmony with the common goal of advancing human knowledge.

With the changes in the political scene in Europe, and after discussions and an exchange of letters in 1991 between the directors-general of CERN and of JINR, it was agreed that future schools would be organized jointly every year and that the title should change to the European School of High-Energy Physics. In each four-year period, three schools would take place in a CERN member state and the fourth in a JINR member state.

In 1993, the first European School took place in Poland, a country that was a member of both CERN and JINR. The following three schools were held in Italy, Russia and France, with the event in Russia being considered the first to be organized in a JINR member state. The full list of host countries for the first 20 European Schools is shown in the box (overleaf). With the new series of schools, Egil Lillestøl replaced Owen Lock as the director of the CERN Schools of Physics, continuing in this role until the 2009 event in Germany, after which he handed over responsibility to the current director, Nick Ellis.

Theory and phenomenology

The target audience for the European Schools is students in experimental high-energy physics who are in the final years of working towards their PhDs. Most of the courses teach theory and phenomenology, concentrating on the physics concepts rather than the details of calculations. This training is highly relevant for the students who will use it in interpreting the results of physics-data analysis, e.g. as they complete the work for their PhD theses. Even if experimental physicists do not usually perform advanced theory calculations, it is of great importance that they can follow and appreciate the published work of their theory colleagues and also have the necessary background to discuss the phenomenology. This last aspect is addressed particularly through the discussion sessions at the schools.

The scientific programme of the European Schools consists of typically four and a half hours of lectures each day (three lectures, each 90 minutes in duration, including questions), complemented by discussion sessions in groups of about 15–20 students with a discussion leader. The programme includes a poster session where many of the students present their own research work to the other participants, including the teachers and organizers. This way, students get to discuss their own work with some of the leading experts in the field.

A new development in the programme since the 2011 school is the inclusion of projects in which the students from each discussion group collaborate as a team to study an experimental data analysis in detail. With this added on top of the rest of the programme, the students say that they have to work really hard; nevertheless, they still seem to enjoy the schools a great deal.

The focus of the schools is mainly on subjects closely related to experimental high-energy physics, so there are always core courses on topics such as field theory and the electroweak Standard Model, quantum chromodynamics, flavour physics and CP violation, neutrino physics, heavy-ion physics and physics beyond the Standard Model. Since 2009 there have also been lectures on practical statistics for particle physicists, which are particularly relevant to the day-to-day work of many of the students.

European Schools

The core courses are complemented by some more topical lectures, including in recent years the latest results from the LHC and their implications. The programme generally also includes lectures related to cosmology, given the important interplay with particle physics, e.g. in connection with dark matter. Last but not least, the directors-general of CERN and JINR often attend in person and give lectures on the scientific programmes of their respective organizations and their outlook for the coming years; this also gives them an opportunity to meet and discuss informally with some of the most promising young physicists in the field.

The scientific programme, including the choice of subjects to be covered and the selection of the lecturers and discussion leaders who will teach at the school, is decided by a small international organizing committee with representatives from CERN and JINR, together with the person from the host country who will serve as the local director for the school. The same body is in charge of selecting the students who will attend the school, based on the applications and letters of recommendation from the professors or supervisors of the candidates.

Poster session

Beyond the purely scientific objectives of the schools, the organizers aim to foster cultural exchange and “networking” between participants from different countries and regions. For this reason the students are assigned to shared twin-room accommodation, mixing people from different countries and regions. Similarly, the discussion groups are chosen to have a good mix of nationalities.

The collaborative student projects that were introduced in 2011 go beyond learning about a specific data analysis. Each group of students, with a little assistance from their discussion leader, has to select a published paper describing the analysis that they are to study; they then have to organize themselves to share the work with different individuals or sub-groups addressing distinct aspects of the analysis; they have to work as a team to prepare and rehearse a short talk summarizing what they have learnt; and they have to select a speaker to represent them. All of these skills are important for young physicists working in large international collaborations such as those that run the LHC experiments.

Geographical enlargement

The European Schools have served as a model for similar series that are now organized in other parts of the world. Since 2001 there have been schools every two years in Latin America, catering for the growing high-energy-physics community there. The most recent event was held on 6–19 March this year in Arequipa, Peru.

A second new series of schools – the Asia-Europe-Pacific School of High-Energy Physics – started last year. The first event was held in Japan and the next one is planned for India in 2014. As with the Latin-American Schools, these events will be held every second year, with a programme that is similar to the model of the European Schools.

Thus, the European Schools have inspired other series catering for the needs of young physicists in other parts of the world. This is part of CERN's policy of geographical enlargement and its mission to support scientists from these regions in increasing their participation in high-energy physics in general and their collaboration with CERN in particular.

The European Schools continue to attract a large number of applications from highly qualified candidates, despite the emergence of many other excellent schools that offer alternative training. For example, the 2013 school, which takes place on 5–18 June in Hungary, was oversubscribed by more than a factor of two compared with the target of around 100 students. This implies a rigorous and highly competitive selection process, focusing on students with the most promise for an outstanding career in high-energy physics and who are at the optimum stage in their studies to benefit from the school.

Discussion session

Critical to the success of the schools are the lecturers and discussion leaders who teach there, selected for their qualities as first-class researchers and also as teachers. They come from institutes in many countries, including ones that are not member states of either CERN or JINR. The European Schools have benefited from the strong support, and often the presence as lecturers, of successive directors-general of both CERN and JINR. The organizers are extremely grateful to the many people from the worldwide high-energy-physics community who every year contribute to the success of the schools, a success that can be judged from the positive feedback received from the students who participate.

Host countries of European Schools

1993 Zakopane, Poland
1994 Sorrento, Italy
1995 Dubna, Russia
1996 Carry-le-Rouet, France
1997 Menstrup, Denmark
1998 St Andrews, United Kingdom
1999 Častá-Papiernička, Slovakia
2000 Caramulo, Portugal
2001 Beatenberg, Switzerland
2002 Pylos, Greece
2003 Tsakhkadzor, Armenia
2004 Sant Feliu de Guíxols, Spain
2005 Kitzbühel, Austria
2006 Aronsborg, Sweden
2007 Třešť, Czech Republic
2008 Herbeumont-sur-Semois, Belgium
2009 Bautzen, Germany
2010 Raasepori, Finland
2011 Cheile Gradistei, Romania
2012 Anjou, France
2013 Parádfürdő, Hungary

Basic Concepts of String Theory

By Ralph Blumenhagen, Dieter Lüst and Stefan Theisen
Springer
Hardback: £72 €84.35 $99
E-book: £56.99 €67.82 $69.95

This new textbook features an introduction to string theory, a fundamental line of research in theoretical physics during recent decades. String theory provides a framework for unifying particle physics and gravity in a coherent manner and, moreover, appears also to be consistent at the quantum level. This sets it apart from other attempts at that goal. More generally, string theory plays an important role as a generator of ideas and “toy” models in many areas of theoretical physics and mathematics; the spin-off includes the application of mathematical methods, originally motivated by and developed within string theory, to other areas. For example, string theory helps in the understanding of certain properties of gauge theories, black holes, the early universe and heavy-ion physics.

Thus any student or researcher in particle physics should have some knowledge of this important field. The book under discussion provides an excellent basis for that. It encompasses a range of essential and advanced topics, aiming at mid- to high-level students and researchers who really want to get into the subject and/or would like to look up some facts. For beginners, who just want to gain an impression of what string theory is all about, the book might be a little hefty and daunting. It really requires a serious effort to master, and corresponds to at least a one-year course on string theory.

The book offers a refreshing mix of basic facts and up-to-date research, and avoids giving too much space to formal and relatively boring subjects such as the quantization of the bosonic string. Rather, the main focus is on the construction and properties of the various string theories in 10 dimensions and their compactifications to lower dimensions; it also includes thorough discussions of D-branes, fluxes and dualities. A particular emphasis is given to the two-dimensional world-sheet, or conformal field-theoretical point of view, which is more “stringy” than the popular supergravity approach. Filling this important gap is one of the strengths of this book, which sets it apart from other recent, similar books.

This is in line with the general focus of the book, namely the unification aspect of string theory, whose main aim is to explain, or at least describe, all known particles and interactions in one consistent framework. In recent years, additional aspects of string theory have become increasingly popular and important lines of research, including the anti-de-Sitter/conformal-field-theory (AdS/CFT) correspondence and the quantum properties of black holes. The book barely touches on these subjects, which is wise because even the basic material would be more than would fit into the same book. For these subjects, a second volume may be in order.

All in all, this book is a perfect guide for someone with moderate prior exposure to field and string theory who would like to get into the principles and technical details of string model construction.

Lectures on Quantum Mechanics

By Steven Weinberg
Cambridge University Press
Hardback: £40 $75

This is a beautifully written book that is crafted with precision and is full of insight. However, for most people it is not the book from which to learn quantum mechanics for the first time. The cover notes acknowledge this and the book is advertised as being "ideally suited to a one-year graduate course" and "a useful reference for researchers". That is not to say that it deals only with advanced material – the theory is built up from scratch and the logical structure is quite traditional.

The book starts with a careful exposition of the early history and the Schrödinger-equation analysis of the hydrogen atom and the harmonic oscillator, before moving on to cover the general principles, angular momentum and symmetries. The middle part of the book is concerned with approximate methods and develops the theory starting from time-independent perturbations and ending with the general theory of scattering. The final part deals mainly with the canonical formalism and the behaviour of a charged particle in an electromagnetic field, including the quantization of the field and the emergence of photons. The final chapter covers entanglement, the Bell inequalities and quantum computing, all in a mere 14 pages.

Perhaps what distinguishes this book from the competition is its logical coherence and depth, and the care with which it has been crafted. Hardly a word is misplaced and Weinberg's deep understanding of the subject matter means that he leaves no stone unturned: we are asked to accept very little on faith. Examples include Pauli's purely algebraic calculation of the hydrogen spectrum, the role of the Wigner–Eckart theorem in a proper appreciation of the Zeeman effect and in atomic selection rules, as well as the emergence of geometrical phases. There is also a thoughtful section on the interpretations of quantum mechanics.

Weinberg has a characteristic style – his writing is full of respect for the reader and avoids sensational comments or attempts to over-emphasize key points. The price we pay is that the narrative is rather flat but in exchange we gain a great deal in elegance and content – it is for the reader to follow Weinberg in discovering the joys of quantum mechanics through a deeper level of understanding: I loved it!

Stochastic Cooling of Particle Beams

By Dieter Möhl
Springer
Paperback: £31.99 €36.87 $39.50
E-book: £24.99 €29.74 $49.95

Over the past decades, stochastic cooling of particle beams has grown, thrived and led to breathtaking results in physics from accelerator labs around the world. Now, great challenges lie ahead in the context of future projects, which strive for highly brilliant secondary-particle beams. For newcomers and researchers alike, there is no better place to learn about stochastic cooling than this book.

Dieter Möhl was one of the foremost experts in the field, having been involved ever since the beginning of the adventure in the 1970s in the team of Simon van der Meer at CERN. Here he has surpassed himself to produce a personal book based not only on his masterful lectures over the years, but also covering, in the proper context and depth, additional subjects that have previously been dispersed across the specialized literature. He goes further by illustrating concepts with his recent personal studies on future projects (e.g. the accumulator ring RESR for the FAIR project) and is well placed to suggest innovations (e.g. alternative methods for stacking and momentum cooling, "split-function" lattices). Insightful remarks based on his experience, invaluable calculation recipes, realistic numerical examples, as well as an excellent bibliography, round off the whole book.

In this self-contained book, Möhl provides a superb pedagogical and concise treatment of the subject, from fundamental concepts up to advanced subjects. He describes the analytical formalism of stochastic cooling, stressing, whenever important, its interplay with the machine hardware and beam diagnostics.
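
As a flavour of that formalism, a simplified form of the cooling-rate equation often quoted in lectures on the subject (a sketch in commonly used notation, not necessarily the exact expression or conventions adopted in the book) reads

```latex
\frac{1}{\tau} \;=\; \frac{W}{N}\left[\,2g - g^{2}\,(M + U)\,\right],
\qquad
g_{\mathrm{opt}} = \frac{1}{M+U}
\;\;\Rightarrow\;\;
\frac{1}{\tau_{\mathrm{opt}}} = \frac{W}{N\,(M+U)},
```

where τ is the cooling time, W the system bandwidth, N the number of particles in the beam, g the normalized gain, M the mixing factor and U the noise-to-signal power ratio. Written this way, the key trade-offs are visible at a glance: the cooling speed scales with the bandwidth per particle, and an optimum gain balances the coherent cooling term against incoherent heating by mixing and amplifier noise.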

The first six chapters introduce the ingredients of the state of the art of stochastic cooling. With deep insight, Möhl explains in chapter 2 all of the different techniques for betatron and/or momentum cooling. This is the most thorough yet compact overview that I know of, a great service to system designers and operators. In both the time-domain and frequency-domain pictures, the reader is guided step by step and with great clarity into delicate aspects of the subject (for instance, the mixing and power requirements) as well as rather complex calculations (such as for betatron cooling, the feedback via the beam and the cooling by nonlinear pickups and kickers). A great help to newcomers and a handy reference for the experts comes in the form of the comprehensive summary on the pickup and kicker impedances in chapter 3 as well as the discussion of the Schottky noise in chapter 4.

Chapter 7 deals with the Fokker-Planck equation and remarkably summarizes its most important application, namely in modelling the beam accumulation by stochastic cooling. The notoriously difficult bunched-beam cooling, which is of great interest for future colliders, is lucidly reviewed in chapter 8.

Dieter Möhl had practically finished the book when he unexpectedly passed away. Throughout this work of reference, his modesty and generosity emerge together with the quintessence of stochastic cooling, as part of his legacy.

Novel Superfluids: Volume 1

By Karl-Heinz Bennemann and John B Ketterson (eds.)
Oxford University Press
Hardback: £125 $210

This volume reports on the latest developments in the field of superfluidity. The phenomenon has had a tremendous impact on the fundamental sciences as well as a host of technologies. In addition to metals and the helium liquids, the phenomenon has now been observed for photons in cavities, excitons in semiconductors, magnons in certain materials and cold gases trapped in high vacuum. It very likely exists for neutrons in a neutron star and, possibly, in a conjectured quark state at their centre. Even the universe itself can be regarded as being in a kind of superfluid state. All of these topics are discussed by experts in the respective subfields.

An Introduction to Non-Perturbative Foundations of Quantum Field Theory

By Franco Strocchi
Oxford University Press
Hardback: £55 $98.50

Quantum field theory (QFT) has proved to be the most useful strategy for the description of elementary-particle interactions and as such is regarded as a fundamental part of modern theoretical physics. In most presentations, the emphasis is on the effectiveness of the theory in producing experimentally testable predictions, which at present essentially means perturbative QFT. However, after more than 50 years of QFT, there is still not a single non-trivial (even non-realistic) model of QFT in 3+1 dimensions that allows non-perturbative control. This book provides general physical principles and a mathematically sound approach to QFT. It covers the general structure of gauge theories, presents the charge superselection rules, gives a non-perturbative treatment of the Higgs mechanism and covers chiral symmetry breaking in QCD without instantons.

Industrial Accelerators and Their Applications

By Robert W Hamm and Marianne E Hamm (eds.)
World Scientific
Hardback: £100
E-book: £127

This new book provides a comprehensive review of the many current industrial applications of particle accelerators, written by experts in each of these fields. Readers will gain a broad understanding of the principles of these applications, the extent to which they are employed and the accelerator technology utilized. It also serves as a thorough introduction to these fields for non-experts and laymen alike. Owing to the growing number of industrial applications, there is an increased interest among accelerator physicists and many other scientists worldwide in understanding how accelerators are used in various applications. Many industries are also doing more research on how they can improve their products or processes using particle beams.

Imaging Gaseous Detectors and Their Applications

By Eugenio Nappi and Vladimir Peskov
Wiley-VCH
Hardback: €139
Paperback: €124.99

For those who belong to the Paleozoic era of R&D on gas detectors, this book evokes nostalgic memories of the hours spent in dark laboratories chasing sparks under black cloths, chasing leaks with screaming "pistols", taming coronas with red paint and yellow tape and, if you belonged to the crazy ones of Building 28 at CERN, sharing a glass of wine and the incredible maggoty Corsican cheese with Georges Charpak. Subtitle it "The Sorcerer's Apprentice", and an innocent student might think they have entered the laboratory of Merlin: creating electrons from each fluttering photon, making magical mixtures of liquids, exotic vapours, funny thin films and all of the strange concoctions that inhabited the era of pioneering R&D and led step by step to today's devices.

The historical memory behind this book recalls all sorts of gaseous detectors that have been dreamt up by visionary scientists over the past 50 years: drift chambers, the ambitious time-projection chamber, resistive plate chambers, ring-imaging Cherenkov counters, parallel-plate avalanche counters, gas electron multipliers, Micromegas, exotic micro-pattern gaseous detectors (MPGDs) and more. All are included, both the ones that behaved and the ones that did not pay off – leaving no excuse for anyone to repeat old mistakes after reading the book. All of the basic processes at work in gas counters are reviewed and their functioning and limitations are explained in a simple and concise manner, offering the attentive reader key secrets and the solutions needed to avoid hidden traps. From the basic ionization processes to the trickiness of the streamer and breakdown mechanisms, from the detection of a single photon to the problems of high rates – only lengthy, hands-on experience supported by a profound understanding of the physics of the detection processes could bring together the material that this book covers. Furthermore, it includes many notable explanations that are crystal clear yet also suitable for the theoretical part of a high-profile educational course.
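
As one example of the kind of basic relation such treatments build on, the proportional gain of a gas counter is governed by the first Townsend coefficient α, while the onset of streamers and breakdown is commonly associated with the Raether criterion; the figures below are the usual textbook rules of thumb, not numbers quoted from the book:

```latex
G \;=\; \exp\!\left(\int \alpha(x)\,\mathrm{d}x\right) \;\approx\; e^{\alpha d},
\qquad
\alpha d \;\lesssim\; 20 \;\;(\text{avalanche size} \sim 10^{8}\ \text{electrons}),
```

where d is the length of the amplification gap. Beyond that limit, the space charge of the avalanche distorts the applied field enough to trigger streamers and, eventually, breakdown – the regime whose "trickiness" the review alludes to above.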

Coming to more recent times, the use of microelectronics techniques in the manufacturing process of gas counters has paved the road to the new era of MPGDs. The authors follow this route, the detector designs and the most promising future directions and applications, critically but with great expectation, leaving the reader confident of many developments to come.

Each of us will find in this book some corner of our own memory, the significance of our own gaseous detector in recent and current experiments, together with a touch of the new in exploring the many possible applications of gas counters in medicine, biology or homeland security and – when closing the book – the compelling need to stay in the lab. Chapeau!

AMS measures antimatter excess in space

The international team running the Alpha Magnetic Spectrometer (AMS) has announced the first results in its search for dark matter. They indicate the observation of an excess of positrons in the cosmic-ray flux. The results were presented by Samuel Ting, the spokesperson of AMS, in a seminar at CERN on 3 April, the date of publication in Physical Review Letters.

The AMS results are based on an analysis of some 2.5 × 10¹⁰ events, recorded over a year and a half. Cuts to reject protons, as well as electrons and positrons produced in the interactions of cosmic rays in the Earth's atmosphere, reduce this to around 6.8 × 10⁶ positron and electron events, including 400,000 positrons with energies between 0.5 GeV and 350 GeV. This represents the largest collection of antimatter particles detected in space.
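
As a rough, back-of-the-envelope illustration of what such a sample implies statistically (a sketch assuming simple binomial counting over the full energy range, not the collaboration's binned analysis), the quoted event counts translate into an integrated positron fraction and statistical uncertainty as follows:

```python
import math

# Approximate event counts quoted above, integrated over all energies
n_positrons = 4.0e5   # positrons between 0.5 GeV and 350 GeV
n_leptons = 6.8e6     # electron-plus-positron events after cuts

# Positron fraction and its purely statistical (binomial) uncertainty;
# energy binning and all systematic effects are ignored in this sketch.
frac = n_positrons / n_leptons
sigma = math.sqrt(frac * (1.0 - frac) / n_leptons)

print(f"positron fraction ~ {frac:.3f} +/- {sigma:.5f} (stat. only)")
# ~0.059 +/- 0.0001: per-mille-level relative precision when integrated
# over energy, leaving room for percent-level precision once the sample
# is divided into energy bins.
```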

The data reveal that the fraction of positrons increases from 10 GeV to 250 GeV, with the slope of the increase reducing by an order of magnitude over the range 20–250 GeV. The data also show no significant variation over time, or any preferred incoming direction. These results are consistent with the positrons’ origin in the annihilation of dark-matter particles in space but they are not yet sufficiently conclusive to rule out other explanations.

The AMS detector is operated by a large international collaboration led by Nobel laureate Samuel Ting. The collaboration involves some 600 researchers from China, Denmark, Finland, France, Germany, Italy, Korea, Mexico, the Netherlands, Portugal, Spain, Switzerland, Taiwan and the US. The detector was assembled at CERN, tested at ESA’s ESTEC centre in the Netherlands and launched into space on 16 May 2011 on board NASA’s Space Shuttle Endeavour. Designed to study cosmic rays before they interact with the Earth’s atmosphere, the experiment is installed on the International Space Station. It tracks incoming charged particles such as protons and electrons, as well as antimatter particles such as positrons, mapping the flux of cosmic rays with unprecedented precision.

An excess of antimatter within the cosmic-ray flux was first observed around two decades ago in experiments flown on high-altitude balloons and has since been seen by the PAMELA detector in space and the Large Area Telescope on the Fermi Gamma-ray Space Telescope. The origin of the excess, however, remains unexplained.

One possibility, predicted by theories involving supersymmetry, is that positrons could be produced when two particles of dark matter collide and annihilate. Assuming an isotropic distribution of dark-matter particles, these theories predict the observations made by AMS. However, the measurement by AMS does not yet rule out the alternative explanation that the positrons originate from pulsars distributed around the galactic plane. Moreover, supersymmetry theories also predict a cut-off at higher energies above the mass range of dark-matter particles and this has not yet been observed.

AMS is the first experiment to measure the positron fraction to 1% accuracy in space – a level of precision that should allow it to establish whether the observed positrons have their origin in dark matter or in pulsars. The experiment will further refine the measurement's precision over the coming years and clarify the behaviour of the positron fraction at energies above 250 GeV.
