
Qu’est-ce que le boson de Higgs mange en hiver et autres détails essentiels

By Pauline Gagnon
MultiMondes
Hardback: €29
E-book: €19
Also available at the CERN bookshop


Pauline Gagnon is well known in the community of LHC experimentalists: in addition to her contribution to the ATLAS experiment, she was a member of CERN's communication group from 2011 to 2014, and on the Quantum Diaries blog she covered many recent events related to the laboratory's scientific activity.

The title of her book, written in French – Qu'est-ce que le boson de Higgs mange en hiver ("What does the Higgs boson eat in winter?") – is somewhat misleading, because the author goes well beyond a description of the Brout–Englert–Higgs mechanism and the experimental discovery of the Higgs boson in 2012. The book offers not only an overview of the physics studied by the LHC experiments, of the accelerator complex and detectors built for this research, and of the statistical methods used in the discovery of the Higgs boson, but also includes a chapter describing the original (and probably unique) organisation of the large international collaborations in high-energy physics, as well as a chapter on the transfer of technology and knowledge from our field to the economy and the general public.

The book also describes the links between high-energy physics and astrophysics, with a chapter devoted to the experimental evidence that led to postulating the existence of dark matter, and to a comparison of the potential for discovering it with accelerator-based and non-accelerator experiments. Another chapter covers supersymmetry – currently the most popular theory beyond the Standard Model for answering the questions that the latter cannot resolve – and the challenges awaiting the LHC experiments in the coming years. The book closes with a discussion of a theme that is somewhat disconnected but dear to the author's heart: the question of diversity (in particular the employment of women) in the world of scientific research.

The book is not aimed at specialists but targets the general public. To this end, the author has banned all mathematical formulae and often uses analogies to introduce the various concepts. The more complex or detailed parts are placed in separate boxes that the reader may choose to skip. In the same spirit, each chapter ends with a summary of about one page, allowing an abridged reading of the topic, with the option of returning to it later. The style is simple and direct, often with a touch of humour. The discussion is not superficial, however, and it seems to me that the book is nevertheless addressed to readers with some basic scientific knowledge – for example, young students who want to understand the interest and goals of research in particle physics.

The Singular Universe and the Reality of Time: A Proposal in Natural Philosophy

By Roberto Mangabeira Unger and Lee Smolin
Cambridge University Press
Hardback: $20
E-book: $17


This is a book on natural philosophy, a field that the authors argue, and convincingly so, has not had much activity for a long time. It is definitely not a popularisation, although it is written clearly enough (and free of equations) that it should be accessible to most knowledgeable readers.

In many ways, this is two books: one of about 350 pages by Unger, a philosopher, and another of about 150 pages by Smolin, a physicist, each presenting overlapping but often dissenting views, together with a discussion of these differences. This means one can be quite comfortable reading it and agreeing or not, as each point is raised.

Perhaps the key idea is that history might play a role in determining why the universe is the way it is, in as fundamental a way as history determines much of biology. This challenges many of the fundamental assumptions that go into cosmology and physics, including the idea that the “laws” of physics are hard-wired into the universe: the authors suggest that the laws could conceivably evolve, just as in biology the governing laws emerge as the space of living things evolves. This puts causal connections in the driving seat, and is akin to taking the Darwinian viewpoint in biology over the creationist myth. A new view emerges on why things are the way they are – an alternative to some hypothetical “elegant(?)” future derivation of why, for example, masses and couplings are what they are.

The authors eschew some ideas that are often advanced today, including that of a multiverse of which ours is but one universe (the “singular” in the title means there is just one), and the idea that time is somehow not real and does not lead to a genuine history. They even argue that mathematics may not merit the (“prophetic”, as they put it) role that we often give it.

It’s a hard book to put down. Whether or not one agrees with the points that are raised, the book is nothing if not thought-provoking, and the ideas could well be revolutionary.

A wealth of data for physics from the LHC: 1400 colliding bunches per beam and counting

Thanks to the work done during the LHC machine-development period and technical stop at the end of the summer, the LHC is enjoying a stable-intensity ramp-up period, which is giving experiments precious data for their physics programmes.

During the machine-development break at the end of August, a variety of measurement and development programmes were carried out in the machine. They included tests for exploring the limits of smaller beam sizes at the interaction points and studies of collimation using bent crystals. Highlights also included the validation of a 40 cm β*, which effectively doubles the luminosity potential of the present set-up. Free from the challenges of high beam intensity, machine availability was high during this remarkably successful machine-development period.

This period was followed by a five-day technical stop. The key objectives were modifications to the critical Quench Protection System, the consolidation of the cooling and electrical-distribution systems, and important maintenance work on the cryogenics system. A huge number of activities were involved in making the technical stop a success.

The effort paid off: since the end of the technical stop, the LHC has gone smoothly through a complete validation period with beam, which ensures that the machine is ready for the intensity ramp-up from a machine-protection standpoint.

The validation is obtained step-by-step and with increasing intensity, both in terms of the number of bunches and the particles in each bunch. The first step consists of running through a full LHC cycle, from injection to collisions and beam dump. This is done initially with a low-intensity bunch (“probe”) to check all of the machine settings and equipment. This phase is followed by a series of collimation- and absorber-validation tests at different points in the LHC cycle. Low-intensity beams – typically the equivalent of three nominal bunches (3 × 10¹¹ protons) – are expanded transversely or longitudinally, or are de-bunched to verify that the collimators and absorbers are correctly intercepting lost particles. The techniques for those validations have been improved progressively, and they can now be performed within 24 h in a few machine cycles.

As soon as the protection systems were validated with the probe beam, the intensity of the beam was ramped up in three steps to 459 bunches per beam – the level that had been reached before the summer stop. Further intensity ramp-ups are performed stepwise: at each step, the LHC must be operated for at least three periods of stable collisions. This is equivalent to integrating at least 20 h of operation before the next intensity step can be authorised. At each step, operators carefully analyse the data collected across many systems, in particular those related to machine protection, and give the green light for an intensity step only when all of the systems show satisfactory performance.

Following this scheme, about 10 days after the end of the break, the machine could be operated with around 1000 bunches per beam and 25 ns bunch spacing, which is the LHC design bunch spacing.

In the present beam configuration, the electron-cloud activity is still significant and considerable power is deposited onto the vacuum-chamber beam screen. For good performance of the machine, the beam-screen temperature should remain below 30 K, and this is achieved by managing the heat-load transients. This operation is particularly delicate for the cryogenic-system operation team in the CERN Control Centre during the injection and energy ramp-up of beams.

The beam intensity of Run 2 can also be measured in terms of the energy stored in each beam: with more than 1000 bunches per beam, the stored energy in each beam exceeds 100 MJ. Towards the end of September, the machine reached 150 MJ, breaking the record of Run 1 (140 MJ).
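
As a rough cross-check of these figures, the stored energy can be estimated from the number of bunches, the bunch population and the proton energy. The sketch below assumes a Run 2 proton energy of 6.5 TeV and a typical bunch population of about 1.1 × 10¹¹ protons; both are assumed values, not quoted in the text:

```python
# Back-of-envelope estimate of the energy stored in one LHC beam.
# Assumptions (not from the article): 6.5 TeV per proton (Run 2) and
# a bunch population of ~1.1e11 protons, typical of 25 ns operation.

EV_TO_J = 1.602e-19          # joules per electronvolt
E_PROTON_EV = 6.5e12         # proton energy, in eV (assumed Run 2 value)
BUNCH_POPULATION = 1.1e11    # protons per bunch (assumed typical value)

def stored_energy_mj(n_bunches):
    """Energy stored in one beam, in megajoules."""
    total_ev = n_bunches * BUNCH_POPULATION * E_PROTON_EV
    return total_ev * EV_TO_J / 1e6

print(f"1000 bunches: {stored_energy_mj(1000):.0f} MJ")  # ~115 MJ, above 100 MJ
print(f"1300 bunches: {stored_energy_mj(1300):.0f} MJ")  # ~150 MJ
```

Even with these assumed inputs, the estimate reproduces the orders of magnitude quoted above, which is why every intensity step is preceded by machine-protection validation.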

CMS observes long-range correlations in pp collisions at 13 TeV

CMS

The CMS collaboration has published its first particle-correlation result from proton–proton (pp) collisions at a centre-of-mass energy of 13 TeV. The paper describes the observation of a phenomenon first seen in nucleus–nucleus collisions, and also detected by CMS in 2010 in the initial LHC pp collision run, at a centre-of-mass energy of 7 TeV. CMS later also observed the phenomenon in proton–lead (pPb) collisions at a centre-of-mass energy of 5 TeV per nucleon pair. The phenomenon is an unexpected correlation between pairs of particles appearing in so-called high-multiplicity collisions – collisions that produce a large number of particles, i.e. more than about 100 charged particles with transverse momentum pT > 0.4 GeV/c within the pseudorapidity region |η| < 2.4. The correlation manifests itself as a ridge-like structure in a 2D angular correlation function.
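
A minimal sketch of how such a 2D angular correlation is built is shown below. This is a toy illustration, not the CMS analysis code: the particle list is randomly generated, and a real measurement also divides by a mixed-event reference sample to remove detector-acceptance effects.

```python
import itertools
import math
import random

# Toy sketch: histogram all particle pairs in (|delta-eta|, delta-phi).
# In real data, the "ridge" appears as an excess at small delta-phi
# that extends over a wide range of delta-eta.

random.seed(1)

# Hypothetical event: (eta, phi) of charged particles with |eta| < 2.4,
# mimicking a high-multiplicity collision of ~120 particles.
particles = [(random.uniform(-2.4, 2.4), random.uniform(-math.pi, math.pi))
             for _ in range(120)]

def pair_distribution(parts, n_eta=12, n_phi=12):
    """Histogram of (|delta-eta|, delta-phi) over all particle pairs."""
    hist = [[0] * n_phi for _ in range(n_eta)]
    for (eta1, phi1), (eta2, phi2) in itertools.combinations(parts, 2):
        deta = abs(eta1 - eta2)                 # ranges over 0 .. 4.8
        dphi = abs(phi1 - phi2)
        dphi = min(dphi, 2 * math.pi - dphi)    # fold into 0 .. pi
        i = min(int(deta / 4.8 * n_eta), n_eta - 1)
        j = min(int(dphi / math.pi * n_phi), n_phi - 1)
        hist[i][j] += 1
    return hist

hist = pair_distribution(particles)
print(sum(map(sum, hist)))  # 120 * 119 / 2 = 7140 pairs
```

With correlated data, an excess would populate the small-Δφ bins across a wide range of Δη – precisely the ridge-like structure described above.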

Following the CMS observation at 7 TeV, interest was expressed concerning the dependence of this phenomenon on the centre-of-mass energy. To more readily address this question, CMS collected a special 13 TeV data set, with an integrated luminosity of 270 nb⁻¹. Here, the average number of simultaneous collisions in a beam bunch crossing was as low as about 1.3, presenting conditions similar to those used for the 7 TeV analysis. Because the effect is expected to appear only in high-multiplicity events, a special trigger was developed based on the number of charged particles detected in the silicon tracker system.

Indeed, about once in every 3000 pp collisions with the highest produced particle multiplicity at 13 TeV, CMS observes an enhancement of particle pairs with small relative azimuthal angle Δφ (figure 1). It therefore appears that charged particles have a slight preference to be emitted pointing in nearly the same azimuthal direction, even if they are very far apart in terms of polar angle, which is measured by the quantity η.

Such correlations are reminiscent of effects first seen in nucleus–nucleus collisions at Brookhaven’s RHIC and later in collisions of lead–lead nuclei (PbPb) at the LHC. Nucleus–nucleus collisions produce a hot, dense medium similar to the quark–gluon plasma thought to have existed in the first microseconds after the Big Bang. The long-range correlations in PbPb collisions are interpreted to result from a hydrodynamic expansion of this medium. Such a medium was not expected in the simpler pp system, and therefore the CMS results from 2010 led to a variety of theoretical models aiming for an explanation.

Remarkably, the new 13 TeV results demonstrate that, within the experimental uncertainties, the strength of the correlation (expressed in terms of associated particle yield) does not depend on the centre-of-mass energy of the pp collision but only on the particle multiplicity. This lack of energy dependence is similar to what is observed for hydrodynamic-flow coefficients measured in nucleus–nucleus collisions at RHIC and the LHC. Compared with the pp results, pPb and PbPb collisions produce correlations that are four and 10 times stronger, respectively, but which are qualitatively very similar to the pp results. The new results from pp collisions extend the measurements to much higher multiplicities compared with those at 7 TeV, and provide the opportunity to understand this curious phenomenon better.

Supersymmetry searches: the most comprehensive ATLAS summary to date

ATLAS

ATLAS has summarised 22 Run 1 searches, using more than 310,000 models to work out where the elusive SUSY particles might be hiding.

The first run of the LHC taught us at least two significant things. First, that there really is a Higgs boson, with properties broadly in line with those predicted by the Standard Model. Second, that the hotly anticipated supersymmetric (SUSY) particles – which were believed to be needed to keep the Higgs boson mass under control – have not been found.

If, as many believe, SUSY is the solution to the Higgs-mass problem, there should be a heavy partner particle for each of the familiar Standard Model fermions and bosons. So why have we missed the super partners? Are they not present at LHC energies? Or are they just around the corner, waiting to be found?

ATLAS has recently taken stock of its progress in addressing the question of the missing SUSY particles. This herculean task examined an astonishing 500 million different models, each representing a possible combination of SUSY-particle masses. The points were drawn from the 19-parameter “phenomenological Minimal Supersymmetric Standard Model” (pMSSM) and concentrated on those models that can contribute to the cosmological dark matter.

The ambitious project involved the detailed simulation of more than 600 million high-energy proton–proton collisions, using the power of the LHC computing grid. Teams from 22 individual ATLAS SUSY searches examined whether they had sensitivity to each of the 310,000 most promising models. This told them which combinations of SUSY masses have been ruled out by the ATLAS Run 1 searches and which would have evaded detection so far.

The results are illuminating. They show that in Run 1, ATLAS had particular sensitivity to SUSY particles with sub-TeV masses and with strong interactions. The best constraints are on the fermionic SUSY partner of the gluon and, to a lesser extent, on the scalar partners of the quarks. Weakly interacting SUSY particles have been much harder to pin down, because they are produced more rarely. The conclusions are broadly consistent with those obtained using simplified models, which are being used to guide Run 2 SUSY searches.

The paper goes on to examine the knock-on effects of the ATLAS searches for other experiments. The ATLAS searches constrain the SUSY models that are being hunted by underground searches for dark-matter relics, and by indirect searches, including those measuring rare B-meson decays and the magnetic moment of the muon.

Today, the higher energy of the 13 TeV LHC is bringing increased sensitivity to rare processes and to higher-mass particles. The ATLAS physics teams are excited to be using their fresh knowledge about where SUSY might be hiding to start the hunt afresh.

LHCb determines the electroweak mixing angle

The electroweak mixing angle, θW, is a fundamental parameter of the Standard Model; it quantifies the relative strengths of electromagnetism and the weak force, and governs the Z-boson couplings to fermions. It is also something of a puzzle: the two most accurate determinations of the angle, carried out at LEP and SLD, differ by some three standard deviations. More recent determinations at the Tevatron experiments, and by ATLAS and CMS at the LHC, have started to probe the difference. Now, LHCb has published a measurement based on LHC data taken in the forward region.

LHCb has measured the asymmetry in the angular distribution of muons in dimuon final states, AFB, as a function of dimuon mass. The asymmetry depends on the squared sine of the electroweak mixing angle, sin²θW, and can be used to determine a value for it once the directions of the interacting quark and antiquark, needed to define the sign of the asymmetry, are known. LHCb’s unique kinematic region benefits the analysis: dilution of the asymmetry is reduced because the incoming quark direction can be identified correctly 90% of the time, and theoretical uncertainties due to parton-density functions are lower than in the central region. In addition, LHCb’s ability to swap the direction of its magnetic field allows many valuable cross-checks to be performed.
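
For readers unfamiliar with the observable, AFB counts the excess of muon pairs emitted “forward” with respect to the quark direction. The standard textbook definition (conventions here are generic, not copied from the LHCb paper) is:

```latex
% Forward-backward asymmetry of the dimuon system; theta* is the polar
% angle of the mu^- relative to the quark direction.
A_{\mathrm{FB}} \;=\;
  \frac{N(\cos\theta^{*} > 0) \;-\; N(\cos\theta^{*} < 0)}
       {N(\cos\theta^{*} > 0) \;+\; N(\cos\theta^{*} < 0)}
```

Near the Z pole, the asymmetry is driven by the fermion vector couplings, g_V^f = T³_f − 2 Q_f sin²θW, which is how a measurement of AFB translates into a determination of the mixing angle.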

An example of the angular asymmetry, for data taken at 8 TeV centre-of-mass energy, is shown in figure 1 as measurement points compared with a (shaded) Standard Model prediction. The effective electroweak mixing angle is found by comparing this asymmetry distribution with a series of Standard Model templates, corresponding to a range of values of the angle, and choosing the one that best matches the data. The analysis is performed on both the 7 and 8 TeV data sets, and the results are combined. The corresponding value of sin²θWeff is determined to be 0.23142 ± 0.00073 (stat.) ± 0.00052 (syst.) ± 0.00056 (theory).

The value is one of the most precise measurements obtained at a hadron collider. Its accuracy is currently limited by statistics and does not yet allow a final word to be said on previous results from LEP, SLD, the Tevatron and the LHC. In LHC Run 2 and beyond, there is scope not just to increase the number of events that can be analysed, but also for improved parton-density functions (which dominate the theoretical error) to become available. The measurement should improve much further.

Novel radionuclides to kill cancer

A new radiolabelled molecule, obtained by the association of the ¹⁷⁷Lu isotope with a somatostatin-analogue peptide, is showing potential as a cancer killer for certain types of tumour. It is being developed by Advanced Accelerator Applications (AAA), a radiopharmaceutical company that was set up in 2002 by Stefano Buono, a former CERN scientist. With its roots in the nuclear-physics expertise acquired at CERN, AAA started its commercial activity with the production of radiotracers for medical imaging. The successful commercial activity made it possible for AAA to invest in nuclear research to produce innovative radiopharmaceuticals.

¹⁷⁷Lu emits both a β particle, which can kill cancerous cells, and a γ ray, which can be useful for SPECT (single-photon emission computed tomography) imaging. Advanced neuroendocrine tumours can be inoperable, and for many patients there are no therapeutic options. However, about 80% of all neuroendocrine tumours overexpress somatostatin receptors, and the radiolabelled molecule is able to selectively target those receptors. The new radiopharmaceutical acts by releasing high-energy electrons after internalisation in the tumour cells through the receptors. The tumour cells are destroyed by the radiation, and the drug is rapidly cleared from the body via urine. A complete treatment consists of only four injections, one every six to eight weeks.

The radiolabelled molecule is currently being used for the treatment of all neuroendocrine tumours on a compassionate-use and named-patient basis in 10 European countries, and approval is being sought in both the EU and the US. A phase-III clinical trial (the NETTER-1 clinical study), conducted in 51 clinical centres in the US and Europe, is testing the product in patients with inoperable, progressive, somatostatin-receptor-positive, mid-gut neuroendocrine tumours. The results of this trial were presented on 27 September in a prestigious Presidential Session at the European Cancer Congress in Vienna, Austria. The NETTER-1 trial demonstrated a statistically significant and clinically meaningful increase in progression-free survival in patients treated with the radiolabelled molecule, compared with patients treated under the current standard of care. The median progression-free survival (PFS) was not reached during the trial in the Lutathera arm, and was 8.4 months in the comparator group (p < 0.0001, hazard ratio: 0.21).

Another labelling radionuclide, the positron emitter ⁶⁸Ga, is a good candidate for the production of a novel radiotracer to be used in the precise diagnosis and follow-up of the same family of diseases using PET (positron-emission tomography).

XENON100 sees no evidence of dark-matter interactions with electrons in liquid xenon

Nearly 400 days of data taken by the XENON collaboration were used to look for the telltale signature of dark matter, an event rate that varies periodically over the course of a year.

The null result of this search – the first of its kind using a liquid-xenon detector – strongly challenges dark-matter interpretations of the annual modulation observed by the DAMA/LIBRA experiments. Both subterranean experiments are operated at the Laboratori Nazionali del Gran Sasso (LNGS).

An annually varying flux of dark matter through the Earth is expected due to the Earth’s orbital motion around the Sun, which results in a change of relative velocity between the Earth and the dark-matter halo thought to encompass the Milky Way. The observation of such an annual modulation is considered to be a crucial aspect of the direct detection of dark matter.

The DAMA/LIBRA experiments have observed an annual modulation of the residual rate in their sodium-iodide detectors since 1998. However, previous null results from several experiments searching for dark-matter-induced nuclear recoils, including XENON100, have challenged such an interpretation of the DAMA/LIBRA signal.

An alternative explanation, that the DAMA/LIBRA signal is instead due to dark-matter interactions with electrons, is challenged strongly by the new results from XENON100. In studies recently published in Science and Physical Review Letters, three models that predict dark-matter interactions with electrons were considered. The very low rate of electronic recoils in XENON100 allowed these models to be ruled out with high probability.

The studies highlight the overall stability and low background of XENON100, a landmark performance for this type of technology. Liquid-xenon detectors continue to lead the field of direct dark-matter detection in terms of their sensitivity to these rare processes. The commissioning of the next generation of XENON experiments at the underground site in LNGS is nearing completion. The new detector, XENON1T, is expected to be 100 times more sensitive than its predecessor, and will hopefully shed more light on the elusive nature of dark matter.

Weblink

• arxiv.org/abs/1507.07748

Borexino finds evidence of neutrinos produced in the Earth’s mantle

In July, the Borexino collaboration reported a geoneutrino signal from the Earth’s mantle at 98% C.L. Geoneutrinos are electron antineutrinos produced by β decays in the ²³⁸U and ²³²Th chains and of ⁴⁰K. These isotopes are naturally present in the interior of the Earth and have lifetimes compatible with the age of the planet. Their radioactive decays contribute significantly to the heat released by the planet. Therefore, the detection of antineutrinos can give geophysicists key information about the relative distribution of the various components in specific layers of the Earth’s interior (crust and mantle).

In Borexino, geoneutrinos are detected in the 278 tonnes of ultra-pure organic liquid scintillator via the inverse β-decay process, ν̄e + p → e⁺ + n, with a threshold in the neutrino energy of 1.806 MeV. Data reported in the recent publication were collected between 15 December 2007 and 8 March 2015, for a total of 2055.9 days before any selection cut. In this data set, the total geoneutrino signal (from the crust and mantle) has been measured for the first time at more than 5σ.
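
The 1.806 MeV figure is the kinematic threshold of inverse β decay on a proton at rest, and can be reproduced from the standard nucleon and electron masses (a textbook derivation, not taken from the Borexino publication):

```latex
% Inverse beta-decay threshold for a target proton at rest,
% with masses in MeV (m_n = 939.565, m_e = 0.511, m_p = 938.272).
E_{\bar{\nu}}^{\mathrm{thr}}
  = \frac{(m_{n} + m_{e})^{2} - m_{p}^{2}}{2\,m_{p}}
  = \frac{(939.565 + 0.511)^{2} - 938.272^{2}}{2 \times 938.272}\ \mathrm{MeV}
  \approx 1.806\ \mathrm{MeV}
```

Notably, this threshold is why the ⁴⁰K contribution mentioned above cannot be seen in this channel: the ⁴⁰K geoneutrino spectrum ends below 1.806 MeV, so only the ²³⁸U and ²³²Th chains contribute to the detected signal.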

The signal disentanglement from background is obtained by applying selection cuts based on the properties of the interaction process. The combined efficiency of the cuts, determined by Monte Carlo techniques, is estimated to be (84.2±1.5)%. A total of 77 antineutrino candidates survived the cuts. They include signals from the Earth and background events; the latter are mainly antineutrinos coming from nuclear reactors, whose signal, corresponding to some 53 events, has been calculated based on data from the International Atomic Energy Agency. From previous studies, the contribution from the crust is estimated to be (23.4±2.8) terrestrial neutrino units (TNU), corresponding to 13 events. To estimate the significance of a positive signal from the mantle, the collaboration has determined the likelihood of Sgeo(mantle) = Sgeo – Sgeo(crust), using the experimental likelihood profile of Sgeo and a Gaussian approximation for the crust contribution. This approach gives a signal from the mantle equal to Sgeo(mantle) = 20.9 +15.1/–10.3 TNU (corresponding to 11 events), with the null hypothesis rejected at 98% C.L.

Although limited by the detection volume and the exposure time, the Borexino researchers could also perform spectroscopy studies (figure 1) that show how their detection technique allows separation of the contributions from uranium (the dark-blue area) and thorium (the light-blue area).

CALET joins the International Space Station

After a spectacular launch from the Tanegashima Space Center on 19 August on board the Japanese H2-B rocket operated by the Japan Aerospace Exploration Agency (JAXA), the CALorimetric Electron Telescope (CALET) docked on the International Space Station on 24 August (EDT). From its privileged position at 400 km altitude, CALET will perform long-duration observations of high-energy charged particles and photons coming from space.

CALET is a space mission led by JAXA, with the participation of the Italian Space Agency and NASA. It is a CERN-recognised experiment and the second high-energy astroparticle experiment installed on the International Space Station (ISS) after AMS-02, which has been taking data since 2011. After berthing with the ISS, CALET was extracted by a robotic arm from the Japanese H-II transfer vehicle and installed on the external platform JEM-EF of the Japanese module. The instrument is now completing its check-out phase. Dedicated calibration runs will precede the start of the science data-taking period, which is expected to continue for several years.

CALET is a space observatory designed to identify electrons, nuclei and γ rays coming from space, and to measure their energies. A high-resolution measurement of the energy is provided by a deep, homogeneous calorimeter preceded by a high-granularity pre-shower calorimeter with imaging capabilities. To ensure very accurate calibration of the calorimetric instruments, the CALET collaboration has carried out several calibration tests at CERN, the most recent one in February 2015.

CALET’s science programme includes measurement of the detailed shape of the electron spectrum above 1 TeV. High-energy electrons are expected to originate less than a few thousand light-years from Earth, because they are known to lose energy quickly when travelling in space. Their detection might be able to reveal the presence of nearby astronomical source(s) where electrons are accelerated. The high end of the spectrum will be particularly interesting to scientists because it will help to resolve the interpretation of the electron and positron spectra reported by AMS-02, and could provide a clue to possible signatures of dark matter.

Thanks to its excellent energy resolution and ability to identify cosmic nuclei from hydrogen to beyond iron, CALET will also be able to study the hadronic component of cosmic rays. The collaboration will investigate the deviation from a pure power law that has been observed recently in the energy spectra of light nuclei, extending the present data to higher energies and measuring accurately the curvature of the spectrum as a function of energy. CALET will also measure the abundance ratio of secondary to primary nuclei, an important ingredient to understand cosmic-ray propagation in the Galaxy.
