
From ionization of air to beyond the LHC


In August, some 100 physicists will gather at Bad Saarow in Germany to celebrate the centenary of the discovery of cosmic rays by the Austrian scientist Victor Hess. The meeting place is close to where Hess and his companions landed following their flight from Aussig, during which they reached 5000 m in a hydrogen-filled balloon; Health and Safety legislation did not restrain them. Finding the rate of ion-production at 5000 m to be about three times that at sea level, Hess speculated that the Earth’s atmosphere was bombarded by high-energy radiation. This anniversary might also be regarded as the centenary of the birth of particle physics. The positron, muon, charged pions and the first strange particles were all discovered in cosmic rays between 1932 and 1947; and in 1938 Pierre Auger and colleagues showed, by studying cascade showers produced in air, that the cosmic-ray spectrum extended to at least 10¹⁵ eV, a claim based on the new ideas of QED.

Reviewing history, one is struck by how reluctant physicists were to contemplate particles other than protons, neutrons, electrons and positrons. The combination of the unexpectedly high energies and uncertainties about the validity of QED meant that flaws in the new theory were often invoked to explain observations that were actually evidence of the muon. Another striking fact is how many giants of theoretical physics, such as Bethe, Bhabha, Born, Fermi, Heisenberg, Landau and Oppenheimer, speculated on the interpretation of cosmic-ray data. However, in 1953, following a famous conference at Bagnères de Bigorre, the focus of particle physics moved to accelerator laboratories; despite some isolated discoveries – such as that of a pair of particles with naked charm by Kiyoshi Niu and colleagues in 1971, three years before the discovery of the J/ψ at accelerators – accelerators were clearly the place to do precision particle physics. This is not surprising, because the beams there are more intense and predictable than nature’s: the cosmic-ray physicist cannot turn to the accelerator experts for help.

Cosmic rays remained – and remain – at the energy frontier, but post-1953 devotees were perhaps over-eager to show that particle-physics discoveries could be made with cosmic rays without massive collaborations. Cosmic-ray physicists preferred to march to the beat of their own drums. This led to attitudes that were sometimes insufficiently critical, and the field became ignored, or even mocked, by many particle physicists. In the 30 years after Bagnères de Bigorre, a plethora of dramatic effects were claimed, including Centauros, the Mandela, high transverse momentum, the free quark, the monopole, the long-flying component and others. Without exception, these effects were never replicated once better cosmic-ray experiments were made or the relevant energies were reached at machines. That many of the key results – good and bad – were hidden in the proceedings of the biennial International Cosmic Ray Conference did not help. Not that the particle-physics community has never made false claims: older readers will recall that in 1970 the editor of Physical Review Letters found it necessary to lay down “bump hunting” rules for those searching for resonances and, of course, the “split A2”.

However, another cosmic-ray “discovery” led to a change of scene. In 1983, a group at Kiel reported evidence for gamma rays of around 10¹⁵ eV from the X-ray binary Cygnus X-3. Their claim was apparently confirmed by the array at Haverah Park in the UK and, at tera-electron-volt energies, by the Whipple telescope in the US. Several particle physicists of the highest class were sucked into the field by the excitement. This led to the construction of the VERITAS, HESS and MAGIC instruments, which have now created a new field of gamma-ray astronomy at tera-electron-volt energies. The construction of the Auger Observatory, the largest cosmic-ray detector ever built, is another major consequence. In addition to important astrophysics results, the instrument has provided information relevant to particle physics. Specifically, the Auger Collaboration has reported a proton–proton cross-section measurement at a centre-of-mass energy of 57 TeV.

When the LHC began to explore the tera-electron-volt energy region, some models used by cosmic-ray physicists were found to fit the first rapidity data as well as, if not better than, those from the particle-physics theorists. It is clear that there is more to be learnt about features of hadronic physics through studying the highest-energy particles, which reach around 10²⁰ eV. Estimates of the primary energy made using hadronic models are significantly higher than those from measurements of the fluorescence light from air showers, which give a calorimetric estimate of the energy that is almost independent of assumptions about particle physics beyond the LHC. Furthermore, the number of muons found in high-energy showers is about 30% greater than the models predict. The Auger Collaboration plans to enhance its instrument to extend these observations.

Towards the end of operations of the Large Electron–Positron collider at CERN, projects such as L3+Cosmics used the high-resolution muon detectors to measure muon multiplicities in showers. Now there are plans to do something similar through the ACME project, part of the outreach programme related to the ATLAS experiment at the LHC, but with a new twist. The aim is for cheap shower detectors of innovative design, paid for by schools, to be built above ATLAS – with students monitoring performance and analysing data. Overall, we are seeing another union of cosmic-ray and particle physics, different from that of pre-1953 but nonetheless one that promises to be as rich and fascinating.

Principles of Radiation Interaction in Matter and Detection (3rd edition)

By Claude Leroy and Pier-Giorgio Rancoita
World Scientific
Hardback: £153 $232
E-book: $302


Like its predecessors, this third edition addresses the fundamental principles of the interaction between radiation and matter, and the principles of particle detection and detectors, in fields ranging from low- to high-energy physics, space physics and the medical environment. It provides abundant information about the processes of electromagnetic and hadronic energy deposition in matter, detecting systems, and the performance and optimization of detectors, with new material added for this edition. Part of the book is also directed towards courses in medical physics.

The Fundamentals of Imaging: From Particles to Galaxies

By Michael Mark Woolfson
Imperial College Press
Hardback: £65 $98
Paperback: £32 $48
E-book: £87 $127


The range of imaging tools, both in the type of wave phenomena used and in the devices that utilize them, is vast. This book illustrates that range, with wave phenomena covering the entire electromagnetic spectrum as well as ultrasound, and devices that vary from those that simply detect the presence of objects to those that produce images in exquisite detail. The aim is also to give an understanding of the principles behind the imaging process and a general account of how those principles are utilized, without delving into the technical details of the construction of specific devices.

A Modern Introduction to Particle Physics (3rd edition)

By Fayyazuddin and Riazuddin
World Scientific
Hardback: £54 $82


The Pakistani brothers, who were both students of Abdus Salam, wrote the first edition of their book in 1992, based on lectures given in various places. Aimed at senior undergraduates or graduate students, it provides a comprehensive account of particle physics. First updated in 2000, this latest edition contains many revised chapters, in particular those that cover heavy flavours, neutrino physics, electroweak unification, supersymmetry and string theory. Another addition is a substantial number of new problems. This self-contained book covers basic concepts and recent developments, as well as the overlaps between astrophysics, cosmology and particle physics.

CERN’s accelerators, experiments and international integration 1959–2009. The European Physical Journal H 36 (4).

By Herwig Schopper et al. (ed.)
Springer


In 2009, CERN’s Proton Synchrotron (PS) reached its half century, having successfully accelerated protons to its design energy for the first time on 24 November 1959. Still in operation more than 50 years later, it not only forms a key part of the injection chain for the LHC but also continues to supply a variety of beams to other facilities, from the Antiproton Decelerator to the CERN Neutrinos to Gran Sasso project. During its operation, the PS has witnessed big changes at CERN; at the same time, particle physics itself has advanced almost beyond recognition, from the days before quarks to the current reign of the Standard Model.

At the close of the anniversary year, CERN held a symposium in honour of the accelerator developments at CERN and the concurrent rise of the Standard Model: “From the PS to the LHC: 50 years of Nobel Memories in High-Energy Physics”. Fittingly, at the end of 2009 the LHC – the machine that everyone expects to take the first steps beyond the Standard Model – was just getting into its stride after the first collisions in November.

Key players who had been close to all of these developments, including 13 Nobel laureates, came together for the symposium. Now, several of the talks have been written up and published in the latest edition of The European Physical Journal H – the journal launched in 2010 as a common forum for physicists, historians and philosophers of science. The edition also includes three additional articles that were invited to provide a more complete picture, by covering CERN’s Intersecting Storage Rings, the history of stochastic cooling and the searches for the Higgs boson at the Large Electron–Positron (LEP) collider – which started up in 1989 and hence celebrated its 20th anniversary at the symposium.

Dip into the pages and you will find many gems: among the Nobel laureates, Jerome Friedman describes the work at SLAC that revealed the reality of quarks, which were unheard of in 1959; Jim Cronin revisits the early 1960s when he and his colleagues discovered CP violation; Jack Steinberger looks back at early experiences at CERN; Carlo Rubbia presents the story of the discovery of W and Z bosons at CERN; and Burt Richter recalls early ideas on LEP, from his days on sabbatical at CERN. On the accelerator side, the articles detail developments with the PS, as well as the highlights (and lowlights) of the construction and running of LEP. The invited article on stochastic cooling includes the work of Simon van der Meer, who shared the Nobel prize with Carlo Rubbia in 1984. Sadly, he was too ill to attend the symposium and passed away in March 2011.

All of the articles provide an interesting view of remarkable events through the reminiscences of people who were not simply “there”, but who played a big part in making them happen. They are a fascinating reminder of what particle physics was like in the past and well worth a read. They also reflect the different styles of the various individuals, but not so much, perhaps, as did the original presentations at the symposium. To get the full flavour, and to see all the participants, take a look at the recordings. There you will find still more gems.

IceCube observations challenge ideas on cosmic-ray origins


The IceCube collaboration, with a detector that instruments a cubic kilometre of ice at the South Pole, has searched for evidence of neutrinos associated with gamma-ray bursts (GRBs). It finds none, at a level 3.7 times lower than models predict, indicating that cosmic rays with energies above 10⁸ TeV originate from some other source.

Where nature accelerates particles to 10⁸ TeV has been one of the long-standing questions of extreme astrophysics. Although the flux of the highest-energy cosmic rays arriving at Earth is small, it pervades the universe and corresponds to a large amount of energy. Equally mysterious in origin, GRBs – some associated with the collapse of massive stars to black holes – have released a small fraction of a solar mass of radiation more than once a day since the Big Bang. The assumption is that they invest a similar amount of energy in the acceleration of protons, which would explain the observed cosmic-ray flux. This leads to the 15-year-old prediction that when protons and gamma rays co-exist in the GRB fireball, they photoproduce pions that decay into neutrinos. The prediction is quantitative (albeit with astrophysical ambiguities) because astronomers can calculate the number of photons in the fireball, and the observed cosmic-ray flux dictates the number of protons. Textbook particle physics then predicts the number of neutrinos.
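For reference, the “textbook” chain usually invoked here – not spelled out in the article – proceeds mainly through the Δ resonance:

p + \gamma \;\to\; \Delta^{+} \;\to\; n + \pi^{+}, \qquad
\pi^{+} \;\to\; \mu^{+} + \nu_{\mu}, \qquad
\mu^{+} \;\to\; e^{+} + \nu_{e} + \bar{\nu}_{\mu},

so each photoproduced π+ ultimately yields three neutrinos whose energies roughly track that of the parent proton.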

With 5160 photomultiplier tubes, the IceCube experiment has transformed a cubic kilometre of Antarctic ice into a Cherenkov detector. Even while still incomplete – taking data with 40 and then 59 of the final 86 strings of photomultipliers – the instrument reached the sensitivity needed to observe GRBs. The measurement is relatively easy because it exploits alerts from NASA’s Swift satellite and the Fermi Gamma-Ray Space Telescope to look for neutrinos arriving from the right direction at the right time. The search window is small enough to make a background-free measurement, because the probability of an accidental coincidence with a high-energy atmospheric neutrino is negligible.
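A rough, purely illustrative estimate shows why such a coincidence search is effectively background-free (the numbers below are order-of-magnitude assumptions, not taken from the article): with a detector-wide atmospheric-neutrino rate of a few times 10⁻³ per second, a time window of order 100 s per burst, a directional window covering a fraction of about 10⁻⁴ of the sky, and roughly 300 bursts searched,

N_{\mathrm{acc}} \;\sim\; (3\times10^{-3}\,\mathrm{s^{-1}}) \times (10^{2}\,\mathrm{s}) \times 10^{-4} \times (3\times10^{2}) \;\approx\; 10^{-2} \;\ll\; 1.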

During the periods of data-taking, 307 GRBs had the potential to result in neutrinos that IceCube could detect. However, the experiment found no evidence for any neutrinos that could be associated with the GRBs. This implies either that GRBs are not the only sources of cosmic rays with energies exceeding 10⁸ TeV or that the efficiency of neutrino production is much lower than has been predicted.

With GRBs on probation, the stock rises for the alternative speculation that associates supermassive black holes at the centres of galaxies with the enigmatic cosmic accelerators.

CMS discovers the Ξb*0


The CMS experiment has discovered its first new particle. The new state is observed with a significance exceeding 5 σ and a mass of 5945.0 ± 2.8 MeV. This mass and the observed decay mode are consistent with its being the beauty-strange baryon known as Ξb*0.

Understanding the detailed spectroscopy of the various families of hadrons has been a quest of scientists ever since quarks were recognized as being the building blocks of protons, neutrons and other hadrons. Baryons are composed of three quarks and if they contain a beauty (b) quark and a strange (s) quark then they are members of the Ξb family. Depending on whether the third valence quark is a u or a d, the resulting baryon is either the neutral Ξb0 or the charged Ξb−. While the charged and neutral lowest-mass states were already known, none of the heavier states had so far been seen. The newly discovered particle is probably the Ξb*0, with total angular momentum and parity JP = 3/2+. Its observation helps in understanding how quarks bind and in further validating the theory of strong interactions.

The observation was made in a data sample of 5.3 fb⁻¹ of proton–proton collisions at a centre-of-mass energy of 7 TeV, delivered by the LHC in 2011. Figure 1 shows a typical event, in which a candidate Ξb*0 (also appropriately called the “cascade b baryon”) leads to a cascade of decays, Ξb*0 → Ξb−π+, Ξb− → J/ψΞ−, J/ψ → μ+μ−, Ξ− → Λ0π− and Λ0 → pπ−, ending in one proton, two muons and three pions. The existence of the Ξb*0 is established by detecting all of these particles and measuring the charge, momentum and point of origin (the vertex) of each one. Requiring that the secondary decay vertices be displaced from the primary vertex reduces the background caused by random combinations of uncorrelated particles, which are copiously produced in high-energy proton–proton collisions.


The invariant-mass distribution of the J/ψΞ− pairs shows a clear peak corresponding to the Ξb− signal, with a mass in good agreement with the world average. The Ξb*0 is expected to decay promptly to Ξb−π+ pairs, so candidates were sought by combining the reconstructed Ξb− with a track (assumed to be a pion) coming from the primary vertex. To partially cancel measurement errors and so increase the sensitivity, the analysis looked at the mass difference Q = M(J/ψΞ−π+) − M(J/ψΞ−) − M(π+). Figure 2 shows the mass difference for 21 events in the range 12 < Q < 18 MeV, which clearly exceed the 3.0 ± 1.4 events expected in the absence of a new particle.
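As a rough cross-check (the Ξb− mass of about 5790 MeV and the charged-pion mass of 139.6 MeV are world-average values assumed here, not quoted in the article), the measured mass corresponds to a Q value that sits comfortably inside that window:

Q \;=\; M(\Xi_b^{*0}) - M(\Xi_b^{-}) - M(\pi^{+}) \;\approx\; 5945.0 - 5790 - 139.6 \;\approx\; 15\ \mathrm{MeV}.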

The detection of this new particle was possible only thanks to the excellent tracking and vertexing capabilities of the CMS experiment, combined with high-purity dimuon triggers that identify decays of the J/ψ meson “on the fly”, before the events are stored. This measurement shows that CMS can unravel complicated chains of particle decays and bodes well for future discoveries of rare particles.

Dijets confirm the Standard Model

Dijet measurements provide an excellent tool not only to probe parton interactions at high transverse momentum and so study QCD, but also to look for signs of new phenomena beyond the Standard Model. Thanks to the outstanding performance of the LHC in 2011, the ATLAS experiment recorded nearly 30,000 events with dijet masses above 2 TeV and even observed dijet masses up to 4.6 TeV.


The collaboration has used the full 2011 data sample – corresponding to nearly 5 fb⁻¹ of integrated luminosity – for a measurement of the dijet cross-section as a function of mass and rapidity difference. The data were first corrected for detector effects – with particular care taken over the effect of possible multiple interactions per beam crossing – and the measured cross-sections were then compared with various QCD predictions. While some models show small deviations at the higher end of the spectrum, overall the agreement with QCD is reasonably good.

QCD predicts that the cross-section falls steeply with dijet mass. New, as yet unobserved, particles would typically give rise to resonances or bumps on top of this smoothly falling spectrum. ATLAS observes no bumps, allowing limits to be set on a number of theories that predict such particles.

Angular distributions can also be used to search for deviations from the Standard Model. They are typically measured in bins of dijet mass, where the scattering angle is transformed into a variable known as χ (see figure). The Standard Model predicts that these distributions should be relatively flat, while many theories beyond the Standard Model predict a rise at low values of χ.
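Although the article does not give it, the conventional definition used in dijet angular analyses is

\chi \;=\; e^{\,|y_{1}-y_{2}|},

where y₁ and y₂ are the rapidities of the two leading jets. Rutherford-like t-channel QCD scattering gives a distribution that is roughly flat in χ, whereas more isotropic new-physics processes would pile up at low χ – hence the search strategy described above.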

The measured distributions are found to be in agreement with QCD predictions, allowing limits to be set on various models for new physics. For one of these models, in which quarks are no longer fundamental particles but are instead composite objects, this analysis sets a lower limit of 7.8 TeV on the compositeness scale – the scale of the constituent binding energies.

Deferred triggering optimizes CPU use

Like all of the LHC experiments, LHCb relies on a tremendous amount of CPU power to select interesting events out of the many millions that the LHC produces every second. Indeed, a large part of the ingenuity of the LHCb collaboration goes into developing trigger algorithms that can sift out the interesting physics from a sea of background. The cleverer the algorithms, the better the physics, but often the computational cost is also higher. About 1500 powerful computing servers in an event filter farm are kept 100% busy when LHCb is taking data and still more could be used.


However, this enormous computing power is used less than 20% of the time when averaged over the entire year. This is partly because of the annual shutdown, so preparations are under way to use the power of the filter farm during that period for offline processing of data – the issues to be addressed include feeding the farm with events from external storage. The rest of the idle time comes from the gaps between the periods when protons are colliding in the LHC (the “fills”); these gaps typically last between two and three hours, during which no collisions take place and therefore no computing power is required.

This raises the question of whether it is somehow possible to borrow the CPU power of the idle servers and use it during physics runs for an extra boost. Such thoughts led to the idea of “deferred triggering”: storing events that cannot be processed online on the local disks of the servers and, later, when the fill is over, processing them on the now-idle servers.

The LHCb Online and Trigger teams quickly worked out the technical details and started the implementation of a deferred trigger early this year. As often happens in online computing, the storing and moving of the data is the easy part, while the true challenge lies in the monitoring and control of the processing, robust error-recovery and careful bookkeeping. After a few weeks, all of the essential pieces were ready for the first successful tests using real data.

Depending on the ratio of fill length to inter-fill time, up to 20% of the CPU time can be deferred – limited only by the available disk space (currently around 200 TB) and the time between fills in the LHC. Buying that amount of CPU power would correspond to an investment of hundreds of thousands of Swiss francs. Instead, this enterprising idea has allowed LHCb to increase the performance of its trigger, making time for more complex algorithms (such as the online reconstruction of KS decays) that extend the physics reach of the experiment.
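The mechanics can be pictured with a minimal sketch – a simplified illustration of the deferred-trigger idea, not the LHCb implementation; all names and numbers here are invented for the example:

"""Toy sketch of deferred triggering (illustrative only, not LHCb code).
During a fill, events beyond a node's live capacity are spilled to local
disk; between fills, the same node drains the backlog."""

import os
import pickle
import tempfile
from collections import deque


class DeferredTriggerNode:
    def __init__(self, live_capacity, spill_dir=None):
        self.live_capacity = live_capacity            # events processed live per fill
        self.spill_dir = spill_dir or tempfile.mkdtemp(prefix="deferred_")
        self.spill_files = deque()                    # backlog of spill files on disk
        self.accepted = []                            # events kept by the trigger

    def trigger(self, event):
        # Placeholder selection: keep events above an arbitrary threshold.
        return event.get("pt", 0.0) > 10.0

    def on_fill(self, events):
        """Process what the node can during the fill; spill the rest to disk."""
        live, overflow = events[:self.live_capacity], events[self.live_capacity:]
        self.accepted.extend(e for e in live if self.trigger(e))
        if overflow:
            path = os.path.join(self.spill_dir, "spill_%d.pkl" % len(self.spill_files))
            with open(path, "wb") as f:
                pickle.dump(overflow, f)
            self.spill_files.append(path)

    def between_fills(self):
        """Drain the disk backlog on the now-idle node."""
        while self.spill_files:
            path = self.spill_files.popleft()
            with open(path, "rb") as f:
                deferred = pickle.load(f)
            os.remove(path)
            self.accepted.extend(e for e in deferred if self.trigger(e))


if __name__ == "__main__":
    node = DeferredTriggerNode(live_capacity=1000)
    fill = [{"pt": i % 50} for i in range(2500)]      # toy "events"
    node.on_fill(fill)        # 1000 processed live, 1500 spilled to disk
    node.between_fills()      # backlog drained once the fill ends
    print(len(node.accepted), "events accepted in total")

A real system must, of course, also handle monitoring, error recovery and bookkeeping, which – as noted above – is where the true challenge lies.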

Ramping up to higher luminosity

After a flying start, with the first stable beams at the new energy of 4 TeV on 5 April, the LHC successfully operated with 1380 bunches per beam – the maximum planned for 2012 – on 18 April. In the days that followed, the machine reached a record peak luminosity of about 5.6 × 10³³ cm⁻² s⁻¹, with a bunch intensity of 1.4 × 10¹¹ protons per bunch and a new record stored energy of 120 MJ per beam.
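As a quick consistency check using only the numbers quoted above (and the conversion 1 eV ≈ 1.6 × 10⁻¹⁹ J), the stored energy per beam is

E \;\approx\; 1380 \times (1.4\times10^{11}) \times 4\,\mathrm{TeV}
  \;\approx\; 1380 \times (1.4\times10^{11}) \times (6.4\times10^{-7}\,\mathrm{J})
  \;\approx\; 1.2\times10^{8}\,\mathrm{J} \;\approx\; 120\,\mathrm{MJ}.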

As it entered a two-day machine-development period on 21–22 April, almost 1 fb⁻¹ of data had been delivered to the experiments, a feat that took until June in 2011. The machine development focused on topics relevant for the 2012 physics-beam operation and was followed by a five-day technical stop, the first of the year.

The restart from 27 April onwards was slowed by several technical faults that led to low machine availability, and the ramp back up in intensity took longer than initially planned. LHC operation was further hampered by higher-than-usual beam losses in the ramp and squeeze, which required time to investigate and mitigate.

On 10 May the machine began running again with 1380 bunches and a couple of days later saw one of the year’s best fills, lasting for 13 hours and delivering an integrated luminosity of 120 pb⁻¹ to ATLAS and CMS. By 15 May, after careful optimization of the beams in the injectors, the luminosity was back up to pre-technical-stop levels. The aim now is for steady running accompanied by a gentle increase in bunch intensity, in order to deliver a sizeable amount of data in time for the summer conferences.
