
How the hippies saved physics: science, counterculture, and the quantum revival

By David Kaiser
W W Norton & Company
Hardback: £17.99 $26.95
Paperback: $17.95


In this curious book, David Kaiser presents a detailed “biography” of a group of young physicists, the “Fundamental Fysiks Group”, based in Berkeley, California, and their unconventional impact on the development of “the new quantum age”. Most of the action takes place in the 1970s and includes a surprising mixture of characters and plots, as suitably summarized in these illuminating words: “Many of the ideas that now occupy the core of quantum information science once found their home amid an anything-goes counterculture frenzy, a mishmash of spoon-bending psychics, Eastern mysticism, LSD trips, CIA spooks chasing mind-reading dreams and comparable ‘Age of Aquarius’ enthusiasms.” These people regularly gathered to discuss all sorts of exotic topics, including telepathy and “remote viewing”, as well as faster-than-light communication and the fundamental concepts of quantum theory.

Among many other things, I liked learning about early discussions regarding Bell’s theorem, the Einstein-Podolsky-Rosen paradox and the nature of reality, sometimes taking place in workshops with sessions in hot baths, interspersed by drum playing and yoga exercises. I also enjoyed reading about the first experimental tests of Bell’s work by John Clauser and about the genesis of the bestseller The Tao of Physics, by Fritjof Capra. It was particularly interesting to learn about a paper on superluminal communication (published despite negative reports from referees), which triggered the development of rebuttal arguments that ended up being quite revolutionary, leading to quantum encryption among other advances. It was thinking outside the “establishment” that led to a wrong but fruitful idea about the implications of Bell’s theorem, which forced others to improve the understanding of quantum entanglement and gave rise to a new and highly successful branch of physics: quantum information. Kaiser’s basic message is that, sometimes, crazy ideas push the understanding of science beyond the frontiers set by people working in conventional environments, within universities and on government grants.

I know that we should not judge a book by its cover but with such a title I expected this book to be an interesting summertime read, so I was surprised to find that it is written in a rather heavy style, more suitable for historians of science than for physicists relaxing on the beach. The topic of the book is actually quite curious, the language is fluid and the narrative is well presented, but the level of detail is such that many readers will often feel like jumping ahead. It is telling that almost 25% of the book’s 400 pages are devoted to notes and bibliography. Essentially every sentence and every paragraph is justified by an endnote, which is overkill for a book targeting a general audience. Writing this dense book must have been a long-term job for Kaiser, who is both a physicist and a historian, but the result does not really qualify as an easy read. I enjoy reading biographies if they have a nice rhythm, some suspense and a few anecdotes here and there – which is not exactly the case for this book. I wonder how many readers end up setting it aside after realizing that they have been misled by the spirited title.

Powering the Future: How We Will (Eventually) Solve the Energy Crisis and Fuel the Civilization of Tomorrow

By Robert Laughlin
Basic Books
Hardback: £17.99 $24.99


Nearly 90% of the world’s economy is driven by the massive use of fossil fuels. The US spends one-sixth of its gross domestic product on oil alone, without counting the important costs of coal and natural gas, even though its use of oil and the other fossil fuels has progressively decreased since the mid-1970s. While the debate on fossil fuels continues to rage on both sides of the Atlantic, Robert Laughlin, professor of physics at Stanford University and Nobel laureate for the fractional quantum Hall effect, has written Powering the Future – a hypothetical voyage through the future, where the human race will have demands and expectations similar to those of today but where technologies will probably be quite different.

The book essentially falls into two halves. The first half contains the main chapters, where all of the essential statements and the logical lines of the various arguments are developed in an informal style. These are then complemented by the second half, which consists of a delightful set of notes. The notes encourage readers to form their own opinions on specific subjects using a number of tools, which range from assorted references to simplified quantitative estimates.

Treatises on energy problems that are written by political scientists are often scientifically inaccurate; specialized monographs are sometimes excessively technical. This book uses an intermediate register, where the quantitative aspects of a problem are discussed but the overall presentation is not pedantic. Of the numerous examples, here are two short ones. What is the total precipitation that falls on the world in one year? The answer is “one metre of rain, the height of a golden retriever” (page 7 and note on page 127). What is the power-carrying capacity of the highest voltage currently used in North America? The answer is “2 billion watts” (page 46 and note on page 156) and is derived with simple mathematical tools.
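
The second of these estimates can be reproduced with one line of arithmetic. As a rough sketch: take a three-phase line at 765 kV – the highest transmission voltage in common North American use – and assume a thermal current rating of about 1.5 kA, an illustrative round number rather than a figure taken from the book:

\[
P \approx \sqrt{3}\,V I \approx \sqrt{3} \times 765\ \mathrm{kV} \times 1.5\ \mathrm{kA} \approx 2 \times 10^{9}\ \mathrm{W}.
\]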

Laughlin’s chain of arguments forms a composite approach to the energy challenge, where fossil fuels will still be needed 200 years from now to fly aeroplanes. Nuclear power plants will inevitably (but cautiously) be exploited and solar energy will offer decisive solutions in limited environments (see chapter nine, “Viva Las Vegas!”). While the author acknowledges that market forces (and not green technology) will be the future driver of energy innovation, the book does not explicitly support any partisan cause but tries to inspect thoroughly the issues at stake.

A few tweets may not suffice to develop informed views on the energy future of the human race. On the other hand, Powering the Future will certainly stimulate many readers (including, I hope, physicists) to form their own judgements and to challenge some of the canned statements that proliferate on the internet these days.

The History of Mathematics: A Very Short Introduction

By Jacqueline Stedall
Oxford University Press
Paperback: £7.99 $11.95


What a wonderful surprise. I was going to review another book before this one but it wasn’t to my liking (actually it was pretty bad) and I gave up after the first few chapters. So I settled instead on this book, mainly because it is short, or “very short” as the subtitle suggests.

Seeing that it was part of a series, I was expecting a typical history starting with Pythagoras and Euclid, then Newton and possibly Leibniz, Euler, Gauss and Riemann, followed by a collection of moderns, depending on how much space was left. I looked in the (excellent) index at the back (opening Q–Z) and was surprised to find no entry for Riemann. Was this British bias? No, Hardy was missing as well – but instead there were other people I’d never heard of: William Oughtred, for example (author of the first maths book published in Oxford), and Etienne d’Espagnet (who supplied Fermat with essential earlier works). Samuel Pepys also makes an appearance, but more as an example of how little maths educated people knew in the 17th century.

I learnt in this charming book that what I had been expecting is called the “stepping stone” approach to the history of mathematics, focusing on elite mathematicians. This book is refreshingly different. It is actually more about the subject “history of mathematics”, i.e. about how we compile and recount a history of mathematics rather than about a sequence of events. However, it does this by focusing on intriguing stories that show the various features that must be considered. In doing so, it fills in the water between the stepping stones, for example, in the story of Fermat’s last theorem. It also tells the story of the majority of people who actually do maths – schoolchildren – by discussing the class work in a Babylonian classroom (around 1850 BC), as well as in a Cumbrian classroom around 1800.

After reading this “preview version”, I am now going to get the “director’s cut” – The Oxford Handbook of the History of Mathematics, which Stedall co-edited with Eleanor Robson.

Happy reading and exploring!

LHC delivers for the summer conferences


With more luminosity delivered by the LHC between April and June 2012 than in the whole of 2011, the experiments had just what the collaborations wanted: as much data as possible before the summer conferences. By the time that a six-day period of machine development began on 18 June, the integrated luminosity for 2012 had reached about 6.6 fb⁻¹, compared with around 5.6 fb⁻¹ delivered in 2011.

The LHC’s performance over the preceding week had become so efficient that the injection kicker magnets – which heat up as the circulating beams repeatedly pass through them – did not have time to cool down between fills. The kickers lose their magnetic properties when the ferrites at their centres become too hot, so on some occasions a few hours of cool-down time had to be allowed before beam for the next fill could be injected.

As the time constants for warming up and cooling down are both of the order of many hours, the temperature of the magnets turns out to provide a good indicator of the LHC’s running efficiency. The record for luminosity production of more than 1.3 fb⁻¹ in a single week corresponds well with the highest measured kicker-magnet temperature of 70°C. A programme is now under way to reduce further the beam impedance of the injection kickers, which should substantially reduce the heating effect in future.
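
The reasoning can be made concrete with a minimal first-order thermal model: when the heating and cooling time constants are many hours, the ferrite temperature effectively integrates the recent beam history, so a hot kicker means the machine has spent most of its time with circulating beam. The sketch below is purely illustrative – the time constant, ambient temperature and beam-induced temperature rise are assumed round numbers, not measured LHC values.

    import numpy as np

    TAU = 5 * 3600.0     # assumed warm-up/cool-down time constant: ~5 h, in seconds
    K_BEAM = 50.0        # assumed equilibrium rise above ambient with beam (°C)
    T_AMBIENT = 20.0     # assumed ambient temperature (°C)

    def kicker_temperature(beam_on, dt=60.0):
        """Integrate dT/dt = (T_target - T)/TAU with a forward-Euler step.

        beam_on -- boolean sequence, one entry per time step dt, True while
                   circulating beam heats the ferrites.
        """
        T, history = T_AMBIENT, []
        for on in beam_on:
            target = T_AMBIENT + (K_BEAM if on else 0.0)
            T += dt * (target - T) / TAU
            history.append(T)
        return np.array(history)

    # Example schedule: a 12 h fill, a 2 h gap, then another 12 h fill.
    # With a many-hour time constant the ferrites barely cool in the gap,
    # so the temperature tracks the fraction of time spent with beam,
    # i.e. the running efficiency.
    schedule = np.concatenate([np.ones(720), np.zeros(120), np.ones(720)]).astype(bool)
    print(f"peak ferrite temperature: {kicker_temperature(schedule).max():.1f} °C")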

Routine operation of the LHC for physics is set to continue over the summer, with the machine operating with 1380 proton bunches in each beam – the maximum value for this year – and around 1.5 × 10¹¹ protons per bunch. The higher beam energy of 4 TeV (compared with 3.5 TeV in 2011) and the higher number of collisions are expected to enhance the machine’s discovery potential considerably, opening new possibilities in the searches for new and heavier particles.

100 years of cosmic rays


On 7 August 1912, Victor Hess took a now famous balloon flight in which he observed a “clearly perceptible rise in radiation with increasing height” and concluded that “radiation of very high penetrating power enters our atmosphere from above”.

This issue of the CERN Courier marks the centenary of the discovery of cosmic rays with a look at cosmic-ray research in the past, as well as at its future directions.

The experiments – and the results – have always been challenging, as a look at those before Hess shows (Domenico Pacini and the origin of cosmic rays). Nevertheless, they led to new techniques, such as the detection of Cherenkov radiation produced in the atmosphere (The discovery of air-Cherenkov radiation), now fundamental for high-energy gamma-ray astronomy (Cherenkov Telescope Array is set to open new windows). Large-scale experiments detect the highest-energy cosmic rays (Studies of ultra-high-energy cosmic rays look to the future) and have their sights on cosmic neutrinos (A neutrino telescope deep in the Mediterranean Sea) in a quest to discover the cosmic accelerators that surpass the highest energies attained in the laboratory. Meanwhile, the LHC contributes with useful data (LHCf: bringing cosmic collisions down to Earth) and some intriguing results (ALICE looks to the skies).

A surprising asymmetry and more excited states

The flavour-changing neutral-current decays B → K(*)μ⁺μ⁻ provide important channels in the search for new physics, as they are highly suppressed in the Standard Model. The predictions in these channels suffer from relatively large theoretical uncertainties but these can be overcome by measuring asymmetries in which the uncertainties cancel. One example is the isospin asymmetry AI, which compares the decays B⁰ → K(*)⁰μ⁺μ⁻ and B⁺ → K(*)⁺μ⁺μ⁻. In the Standard Model, AI is predicted to be small, around –1%, for the decays to the excited K*, and while there is no precise prediction for the decays to the K, a similar value is expected.
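
For reference, the isospin asymmetry is conventionally defined in terms of the partial widths of the two decays (equivalently, the branching fractions corrected by the ratio of the B⁰ and B⁺ lifetimes):

\[
A_I = \frac{\Gamma(B^0 \to K^{(*)0}\mu^+\mu^-) - \Gamma(B^+ \to K^{(*)+}\mu^+\mu^-)}
           {\Gamma(B^0 \to K^{(*)0}\mu^+\mu^-) + \Gamma(B^+ \to K^{(*)+}\mu^+\mu^-)},
\]

so that hadronic uncertainties common to the numerator and the denominator cancel in the ratio.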


LHCb has measured AI for these decays as a function of the dimuon mass squared (q²), using data corresponding to an integrated luminosity of 1.0 fb⁻¹, with a surprising result. While the measurements for B → K*μ⁺μ⁻ are consistent with the prediction of negligible isospin asymmetry, the value for B → Kμ⁺μ⁻ is non-zero. In particular, in the two q² bins below 4.3 GeV²/c⁴ and in the highest bin, above 16 GeV²/c⁴, the isospin asymmetry is negative in the B → Kμ⁺μ⁻ channel. These q² regions are furthest from the charmonium regions and are cleanly predicted theoretically. The measured asymmetry is dominated by the deficit observed in B⁰ → K⁰μ⁺μ⁻. Integrated over the dimuon mass range, the result for AI deviates from zero by more than 4σ.

These results were obtained with the full data sample for 2011, which should more than double by the end of 2012. In the meantime, theorists will analyse this puzzling result to establish whether this effect can be accommodated in the framework of the Standard Model – or whether its explanation requires new physics.

In a different study, LHCb observed two excited Λb states for the first time, as predicted within the context of the quark model. The excited states were reconstructed in three steps. First, Λc⁺ particles were reconstructed through their decay Λc⁺ → pK⁻π⁺; then the Λc⁺ particles were combined with π⁻ to look for Λb particles; finally, the Λb particles were combined with π⁺π⁻ pairs. In this way the team found about 16 Λb(5912) → Λbπ⁺π⁻ decays (4.9σ significance) and about 50 Λb(5920) → Λbπ⁺π⁻ decays (10.1σ) among some 6 × 10¹³ proton–proton collisions detected during 2011.

Seeing bosons in heavy-ion collisions

Studies of heavy-ion collisions at the LHC are challenging and refining ideas on how to probe QCD – the theory of the strong interaction – at high temperature and density. From precision analyses of particle “flow” that clearly distinguish pre-collision effects from post-collision effects, to the observation of jet quenching, the ATLAS collaboration is releasing many new results. Several of these observations are surprising and unexpected, such as the occurrence of strong jet quenching with almost no jet broadening; and complete explanations are currently lacking. One new set of results, however, spectacularly confirms expectations: photons and the heavy W and Z bosons are unaffected by the hot dense QCD medium.


Direct measurements of energetic photon production recently released by the collaboration show that the number of photons produced is just what would be expected from ordinary proton–proton collisions when extrapolated to the multiple collisions within the heavy-ion interactions. This effect is truly independent of the “centrality” of the collision, the parameter that distinguishes head-on (central) collisions from grazing collisions. Similar observations have been made at much lower energies. However, by taking advantage not only of the LHC beam energy but also of the capacity of the ATLAS calorimeters to make precision measurements and reject background events, this new study extends the results to energies 10 times higher for central collisions.

ATLAS has also released new measurements of Z-boson production, which show that, like photons, Zs are unaffected by the heavy-ion environment; the number produced is exactly what would be expected from “binary scaling”, i.e. scaling the proton–proton yield up by the number of nucleon–nucleon collisions. The Z bosons were measured through their decays both to two muons, using the ATLAS muon spectrometer, and to electron–positron pairs, with the ATLAS calorimeters. The observation of binary scaling not only shows that the Zs are unaffected by the medium, but also reveals that the electrons, positrons and muons produced are unaffected, as expected.
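
In this context, binary scaling is usually quantified through the nuclear modification factor, which compares the yield in lead–lead collisions with the proton–proton yield scaled by the average number of binary nucleon–nucleon collisions:

\[
R_{AA} = \frac{1}{\langle N_{\mathrm{coll}} \rangle}\,
\frac{\mathrm{d}N_{AA}/\mathrm{d}p_T}{\mathrm{d}N_{pp}/\mathrm{d}p_T},
\]

with R_AA = 1 corresponding to binary scaling, as observed here for photons and Z bosons, while R_AA < 1 signals suppression, as for quenched jets.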

These results open up a long-dreamt-of possibility in this field: the study of jet–boson correlations. Because the bosons are unaffected by the hot dense medium, they can be used as a “control” to study precisely the suppression of jets. ATLAS is already making prototype measurements of this kind and high precision should be attainable in future LHC runs.

• For more information, see https://twiki.cern.ch/twiki/bin/view/AtlasPublic.

EXO, MINOS and OPERA reveal new results


The first results from the Enriched Xenon Observatory 200 (EXO-200) on the search for neutrinoless double beta decay show no evidence for this hypothesised process, which would shed new light on the nature of the neutrino. Located in the US Department of Energy’s Waste Isolation Pilot Plant in New Mexico, EXO-200 is a large beta-decay detector. In 2011 it was the first to measure two-neutrino double beta decay in ¹³⁶Xe; now it has set a lower limit for neutrinoless double beta decay for the same isotope.

Double beta decay, first observed in 1986, occurs when a nucleus is energetically unable to decay via single beta decay, but can instead lose energy through the conversion of two neutrons to protons, with the emission of two electrons and two antineutrinos. The related process without the emission of antineutrinos is theoretically possible but only if the neutrino is a “Majorana” particle, i.e. it is its own antiparticle.
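
In symbols, for a parent nucleus of mass number A and atomic number Z, the two processes are

\[
(A,Z) \to (A,Z+2) + 2e^- + 2\bar{\nu}_e \quad (2\nu\beta\beta), \qquad
(A,Z) \to (A,Z+2) + 2e^- \quad (0\nu\beta\beta),
\]

the second violating lepton-number conservation by two units, which is possible only if the neutrino is a Majorana particle.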

EXO-200 uses 200 kg of ¹³⁶Xe to search for double beta decay. Xenon can be easily purified and reused, and it can be enriched in the ¹³⁶Xe isotope using Russian centrifuges, which makes processing large quantities feasible. It also has a decay energy – Q-value – of 2.48 MeV, high enough to be above many of the uranium emission lines. Using ¹³⁶Xe as a scintillator gives excellent energy resolution through the collection both of ionization electrons and of scintillation light. Finally, xenon in principle allows for complete background elimination through tagging of the daughter barium ion. This capability, combined with the detector’s location more than 650 m underground and the use of materials selected and screened for radiopurity, ensures that other traces of radioactivity and cosmic radiation are eliminated or kept to a minimum. The latest results reflect this low background activity and high sensitivity – as only one event was recorded in the region where neutrinoless double beta decay was expected.

In the latest result, no signal for neutrinoless double beta decay was observed for an exposure of 32.5 kg·yr, with a background of about 1.5 × 10⁻³ kg⁻¹ yr⁻¹ keV⁻¹. This sets a lower limit on the half-life of neutrinoless double beta decay in ¹³⁶Xe of greater than 1.6 × 10²⁵ yr, corresponding to effective Majorana masses of less than 140–380 meV, depending on details of the calculation (Auger et al. 2012).
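
The quoted range of effective masses follows from the standard relation between the neutrinoless half-life and the effective Majorana mass; the spread of 140–380 meV reflects the different available calculations of the nuclear matrix element:

\[
\left(T_{1/2}^{0\nu}\right)^{-1} = G^{0\nu}\,\bigl|M^{0\nu}\bigr|^{2}\,
\frac{\langle m_{\beta\beta}\rangle^{2}}{m_{e}^{2}},
\]

where \(G^{0\nu}\) is a calculable phase-space factor, \(M^{0\nu}\) the nuclear matrix element and \(m_e\) the electron mass.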


The EXO collaboration announced the results at Neutrino 2012, the 25th International Conference on Neutrino Physics and Astrophysics, held in Kyoto, on 3–9 June. This dedicated conference for the neutrino community provided the occasion for many neutrino experiments to publicize their latest results. In the case of the MINOS collaboration, these included the final results from the first phase of the experiment, which studies oscillations between neutrino types.

In 2010 the MINOS collaboration caused a stir when it announced the observation of a surprising difference between neutrinos and antineutrinos. Measurements of a key parameter used in the study of oscillations – Δm², the difference in the squares of the masses of two oscillating types – gave different values for neutrinos and antineutrinos. In 2011, additional statistics brought the values closer together and, with twice as much antineutrino data collected since then, the gap has now closed. From a total exposure of 2.95 × 10²⁰ protons on target, the value found for muon antineutrinos was Δm² = 2.62 +0.31/–0.28 (stat.) ± 0.09 (syst.) × 10⁻³ eV², and the antineutrino “atmospheric” mixing angle was constrained to sin²2θ > 0.75 at 90% confidence level (Adamson et al. 2012). These values are in agreement with those measured for muon neutrinos.
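
Both parameters enter through the standard two-flavour survival probability for a muon (anti)neutrino of energy E after travelling a distance L – the quantity that MINOS fits to its measured energy spectra:

\[
P(\nu_\mu \to \nu_\mu) = 1 - \sin^2 2\theta\,
\sin^2\!\left(\frac{1.27\,\Delta m^2\,[\mathrm{eV}^2]\;L\,[\mathrm{km}]}{E\,[\mathrm{GeV}]}\right).
\]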

Since its debut in 2006, the OPERA experiment in the Gran Sasso National Laboratory has been searching for neutrino oscillations in which muon-neutrinos transform into τ-neutrinos as they travel the 730 km of rock between CERN, where they originate, and the laboratory in Italy. At the conference, the OPERA collaboration announced the observation of its second τ-neutrino, following the first observation two years ago. This new event is an important step towards the experiment’s final goal.

Results on the time of flight of neutrinos from CERN to the Gran Sasso were also presented by CERN’s director for research and scientific computing, Sergio Bertolucci, on behalf of four experiments. All four – Borexino, ICARUS, LVD and OPERA – measure a neutrino time of flight that is consistent with the speed of light. The indications are that a measurement by OPERA announced last September can be attributed to a faulty element of the experiment’s fibre-optic timing system.

Elements 114 and 116 receive official names

IUPAC has officially approved the names “flerovium” (Fl) for the element with atomic number 114 and “livermorium” (Lv) for the one with atomic number 116. The names were proposed by the collaboration from the Joint Institute for Nuclear Research (JINR), Dubna, and the Lawrence Livermore National Laboratory in California, led by JINR’s Yuri Oganessian. Scientists from the two laboratories share the priority for the discovery of these new elements at the facilities in Dubna.

The name flerovium is in honour of the Flerov Laboratory of Nuclear Reactions, where these superheavy elements were synthesized. Georgy Flerov (1913–1990) was a pioneer in heavy-ion physics and in 1957 founded the JINR Laboratory of Nuclear Reactions, which has borne his name since 1991. Flerov is also known for fundamental work that resulted in the discovery of new phenomena in the properties and interactions of atomic nuclei.

The name livermorium honours the Lawrence Livermore National Laboratory. A group of researchers from Livermore took part in the work carried out in Dubna on the synthesis of superheavy elements, including element 116. Over the years, researchers at the laboratory have been involved in many areas of nuclear science and in the investigation of the chemical properties of the heaviest elements.

The discoverers of flerovium and livermorium have submitted their claims for the discovery of further heavy elements, with atomic numbers 113, 115, 117 and 118, to the Joint Working Party of independent experts drawn from the International Union of Pure and Applied Chemistry (IUPAC) and the International Union of Pure and Applied Physics (IUPAP).

Lead collisions in the LHC top the bill in Cagliari

Hard Probes 2012 – the 5th International Conference on Hard and Electromagnetic Probes of Nuclear Collisions – took place in Cagliari from 27 May to 1 June. The most important topical meeting to focus on the study of hard processes in ultra-relativistic heavy-ion collisions, it was also the first at which the LHC collaborations presented results based on lead–lead data. The main focus was undoubtedly on the wealth of new high-quality results from ALICE, ATLAS and CMS, complemented by significant contributions from the PHENIX and STAR experiments at the Relativistic Heavy Ion Collider (RHIC) in Brookhaven.

Quoting from the inspired opening talk given by Berndt Mueller of Duke University, the hard-probes “manifesto” can be summarized as follows: hard probes are essential to resolve and study a medium of deconfined quarks and gluons at short spatial scales, and they have to be developed into as precise a tool as possible. This is accomplished by studying the production and the propagation in the deconfined medium of heavy quarks, particles with high transverse momentum (pT), jets and quarkonia.

Jet quenching can be addressed by studying the suppression of leading hadrons in nuclear collisions with respect to the proton–proton case. The ALICE and CMS collaborations reported results on the production of open charm and beauty, and results were also presented from the STAR experiment. An important aspect of parton energy loss in the medium is its mass dependence: the energy loss is expected to be strongest for light quarks and gluons and smaller for heavy quarks. The LHC data shown at the conference are suggestive of such a hierarchy, although more statistics are still needed to reach a firm conclusion.

In addition, the high-precision LHC data on light charged hadrons are significantly expanding the kinematic reach. This is fundamental to discriminating among theoretical models, which have been tuned at the lower energy of RHIC.

At the LHC, full reconstruction of high-energy jets has become possible for the first time, allowing ATLAS and CMS to present high-statistics results on jet–jet correlations. The emerging picture is consistent with one in which partons lose a large fraction of their energy while traversing the hot QCD medium – before fragmenting essentially in vacuum. First results on γ-jet correlations were also presented by the CMS and PHENIX collaborations; these allow the tagging of quark jets and give a better estimate of the initial parton energy. During the conference, an intense debate developed on how to exploit fully the information provided by full jet reconstruction.

Quarkonium suppression was another of the striking observables for which results from the LHC had been eagerly awaited. CMS presented the first exciting precision results on the suppression of the ϒ states. These reveal a clear indication of a much larger suppression for the weakly bound ϒ(2S) and ϒ(3S) states than for the strongly bound ϒ(1S), in accordance with predictions based on colour screening. The ALICE collaboration presented new data on the rapidity and pT dependence of J/ψ suppression. The results show that, despite the higher initial temperatures reached at the LHC, the suppression remains significantly smaller than at RHIC. This is an intriguing hint that a regeneration mechanism involving the large number of charm quarks present in the deconfined medium may be at work at LHC energies.

Part of the conference was devoted to the study of initial-state phenomena. In particular, at high energy, peculiar features related to the saturation of the gluon phase-space should emerge, leading to a state called the “colour glass condensate”. A discussion took place on how the existence of this state could be proved or disproved at the LHC. The study of initial-state phenomena also came under debate because of its importance in disentangling the effects of cold nuclear matter from genuine final-state effects in hot matter.

With the advent of high-precision data, theory is being increasingly challenged, and the understanding of the bulk properties of the medium produced in heavy-ion collisions is rapidly advancing. As several speakers discussed, significant advances are being made both in the understanding of the parton energy-loss mechanism and in that of quarkonium production, for which a quantitative picture is emerging.

Still, as CERN’s Jürgen Schukraft pointed out in his summary talk, there is a need for measurements of even higher precision, as well as a wish list for new measurements: for example, in the heavy-flavour sector, lowering the pT reach to measure the total charm cross-section; and reconstructing charmed and beauty baryons to gain further insight into thermalization of the medium.

On a shorter timescale, the next crucial step is the measurement of effects in cold nuclear matter, which will be possible in the forthcoming proton–nucleus run at the LHC. Based on the experience from past lower-energy measurements, new surprises might be just around the corner.

The conference was preceded by introductory student lectures covering aspects of quarkonia production and jet quenching. About 40 students were supported by the organization, thanks to generous contributions by several international laboratories (CERN, EMMI, INFN) and, in particular, by the University of Cagliari and by the government of Sardinia. The conference was broadcast to a wider audience worldwide as a webcast.

• For more information, see the conference website www.ca.infn.it/hp12/.
