An important day for science

On 4 July 2012, particle physics was headline news around the world thanks to a scientific success story that began over 60 years ago. It was a great day for science and a great day for humanity: a symbol of what people can achieve when countries pool their resources and work together, particularly when they do so over the long term.

This particular success story is called CERN, a European laboratory for fundamental research born from the ashes of the Second World War with support from all parties, in Europe and beyond. The headline news was the discovery of a particle consistent with the long-sought-after Higgs boson, certainly a great moment for science. In the long term, however, the legacy of 4 July may well be that CERN’s global impact endorses the model established by the organization’s founding fathers in the 1950s and shows that it still sets the standard for scientific collaboration today. CERN’s success exemplifies what can be achieved when we keep sight of the vision that those pioneers had: a community of scientists, united in diversity, pursuing a common goal.

CERN is a European organization, founded on principles of fairness to its members and openness to the world. Accordingly, its governance model gives a fair voice to all member states, large and small. Its funding model allows member states to contribute according to their means. Its research model welcomes scientists from around the world who are able to contribute positively to the laboratory’s research programmes. Through these basic principles, CERN’s founding fathers established a model of stability for cross-border collaboration in Europe and for co-ordinated European engagement with the rest of the world, and they laid down a blueprint for leadership in the field of particle physics. The result is that today CERN is undisputedly the hub of a global community of scientists advancing the frontiers of knowledge. It is a shining example of what people can do together.

This fact has not been lost on other fields and, over the years, several European scientific organizations have emulated the CERN model. The European Space Agency (ESA) and the European Southern Observatory (ESO), for example, have also established themselves as leaders in their fields. Today, those planning future global science projects look to the CERN model for inspiration.

Scientific success stories like this are now more important than ever. At a time when the world is suffering the worst economic crisis in decades, people – particularly the young – need to see and appreciate the benefits of basic science and of collaboration across borders. And at a time when science is increasingly estranged from a science-dependent society, it is important for good science stories to make the news and encourage people to look beyond the headlines. For these reasons, as well as for the discovery itself, 4 July was an important day for science.

New results cast light on semileptonic Bs asymmetry

The LHCb experiment has made the most precise measurement to date of the asymmetry a_sl^s, which is a measure of flavour-specific matter–antimatter asymmetry in the Bs-meson system and a test for physics beyond the Standard Model.

In 2010, and with an update in 2011, the Fermilab DØ collaboration reported an asymmetry in the semileptonic decays of B mesons into muons, which they observed in the number of events containing same-sign dimuons. The most recent result, using almost the full DØ data sample of 9 fb⁻¹, gives an asymmetry of about –1%, which differs by 3.9σ from the tiny value predicted within the framework of the Standard Model (Abazov et al. 2011). If confirmed, it would indicate the presence of new physics.

Same-sign dimuons can be produced from the decay of pairs of neutral B mesons, which can mix between their particle and antiparticle states. Owing to the inclusive nature of the DØ measurement, the asymmetry, denoted A_sl^b, is a sum of contributions from the individual asymmetries in the Bd- and Bs-meson systems, a_sl^d and a_sl^s respectively. It is shown as the diagonal band in the plane of those asymmetries in the figure. The individual asymmetries characterize CP violation in B-meson mixing, similar to the parameter εK in the neutral-kaon system.
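
In explicit form, the flavour-specific asymmetry and the DØ combination can be written schematically as follows (a sketch; the coefficients shown are roughly those quoted by DØ for their running conditions and depend on production fractions and mixing probabilities):

$$
a_{\rm sl}^{q} = \frac{\Gamma(\bar{B}_q \to f) - \Gamma(B_q \to \bar{f})}{\Gamma(\bar{B}_q \to f) + \Gamma(B_q \to \bar{f})}\,, \qquad
A_{\rm sl}^{b} \simeq C_d\, a_{\rm sl}^{d} + C_s\, a_{\rm sl}^{s}\,, \quad C_d \approx 0.6,\; C_s \approx 0.4\,,
$$

where f is a flavour-specific final state (here containing a muon) that is accessible directly only from the Bq, so that a B̄q reaches it only after mixing.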

One of the highest priorities in flavour physics has been to measure a_sl^d and a_sl^s separately, to establish if there is a disagreement with the Standard Model – and, if so, whether it occurs in the Bd or the Bs system. Previous measurements of a_sl^d by the BaBar and Belle collaborations, working at the ϒ(4S) resonance, and of a_sl^s in an independent analysis by DØ have not been sufficiently precise to answer this question.

The new result from LHCb, based on the full 2011 data sample of 1.0 fb⁻¹ and first presented at ICHEP2012 (p53), provides the most precise measurement to date of a_sl^s. The analysis uses B_s^0 → D_s^-μ^+X (and charge-conjugate) decays, with D_s^- → φπ^-, and relies on excellent control of asymmetries in the μ± trigger and reconstruction. The result, a_sl^s = (–0.24 ± 0.54 ± 0.33)%, which is shown as the horizontal blue band in the figure, is consistent with the Standard Model prediction (LHCb collaboration 2012). Updated results from DØ on both a_sl^d and a_sl^s, which were also presented at ICHEP2012, continue to leave the situation unclear; more precise measurements are needed (Stone 2012). With the recently announced extension of proton running at the LHC for 2012, the LHCb collaboration expects to more than triple its data sample, so updates on this topic will be most exciting.

How the Hippies Saved Physics: Science, Counterculture, and the Quantum Revival

By David Kaiser
W W Norton & Company
Hardback: £17.99 $26.95
Paperback: $17.95

In this curious book, David Kaiser presents a detailed “biography” of a group of young physicists, the “Fundamental Fysics Group”, based in Berkeley, California, and their unconventional impact on the development of “the new quantum age”. Most of the action takes place in the 1970s and includes a surprising mixture of characters and plots, as suitably summarized in these illuminating words: “Many of the ideas that now occupy the core of quantum information science once found their home amid an anything-goes counterculture frenzy, a mishmash of spoon-bending psychics, Eastern mysticism, LSD trips, CIA spooks chasing mind-reading dreams and comparable ‘Age of Aquarius’ enthusiasms.” These people regularly gathered to discuss all sorts of exotic topics, including telepathy and “remote viewing”, as well as faster-than-light communication and the fundamental concepts of quantum theory.

Among many other things, I liked learning about early discussions regarding Bell’s theorem, the Einstein–Podolsky–Rosen paradox and the nature of reality, sometimes taking place in workshops with sessions in hot baths, interspersed with drum playing and yoga exercises. I also enjoyed reading about the first experimental tests of Bell’s work by John Clauser and about the genesis of the bestseller The Tao of Physics, by Fritjof Capra. It was particularly interesting to learn about a paper on superluminal communication (published despite negative reports from referees) that triggered the development of rebuttal arguments which ended up being quite revolutionary, eventually leading to quantum encryption. It was thinking outside the “establishment” that led to a wrong but fruitful idea about the implications of Bell’s theorem, which forced others to improve the understanding of quantum entanglement and gave rise to a new and highly successful branch of physics: quantum information. Kaiser’s basic message is that, sometimes, crazy ideas push the understanding of science beyond the frontiers set by people working in conventional environments – within universities and on government grants.

I know that we should not judge a book by its cover but, with such a title, I expected an entertaining summertime read and was surprised to find a book written in a rather heavy style, more suitable for historians of science than for physicists relaxing on the beach. The topic is actually quite curious, the language is fluid and the narrative is well presented, but the level of detail is such that many readers will often feel like jumping ahead. It is telling that almost 25% of the book’s 400 pages are devoted to notes and bibliography. Essentially every sentence and every paragraph is justified by an endnote, which is overkill for a book targeting a general audience. Writing such a dense book must have been a long-term job for Kaiser, who is both a physicist and a historian, but the result does not really qualify as an easy read. I enjoy reading biographies if they have a nice rhythm, some suspense and a few anecdotes here and there – which is not exactly the case for this book. I wonder how many readers end up setting it aside after realizing that they have been misled by the spirited title.

Powering the Future: How We Will (Eventually) Solve the Energy Crisis and Fuel the Civilization of Tomorrow

By Robert Laughlin
Basic Books
Hardback: £17.99 $24.99

Nearly 90% of the world’s economy is driven by the massive use of fossil fuels. The US spends one-sixth of its gross domestic product on oil alone, without counting the substantial costs of coal and natural gas, even though its use of oil and the other fossil fuels has decreased progressively since the mid-1970s. While the debate on fossil fuels continues to rage on both sides of the Atlantic, Robert Laughlin, professor of physics at Stanford University and Nobel laureate for the fractional quantum Hall effect, has written Powering the Future – a hypothetical voyage into the future, where the human race will have demands and expectations similar to those of today but where technologies will probably be quite different.

The book essentially falls into two halves. The first half contains the main chapters, where all of the essential statements and the logical lines of the various arguments are developed in an informal style. These are complemented by the second half, which consists of a delightful set of notes. The notes encourage readers to form their own opinions on specific subjects using a number of tools, ranging from assorted references to simplified quantitative estimates.

Treatises on energy problems written by political scientists are often scientifically inaccurate; specialized monographs are sometimes excessively technical. This book uses an intermediate register, where the quantitative aspects of a problem are discussed but the overall presentation is not pedantic. Of the numerous examples, here are two short ones. What is the total precipitation that falls in one year on the world? The answer is “one metre of rain, the height of a golden retriever” (page 7 and note on page 127). What is the power-carrying capacity of the highest voltage currently used in North America? The answer is “2 billion watts” (page 46 and note on page 156) and is derived with simple mathematical tools.
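
To give a flavour of those simple mathematical tools, the second estimate can be reproduced in one line (a sketch assuming a 765 kV three-phase line – the highest transmission voltage in use in North America – with a thermal limit of roughly 1.5 kA per phase):

$$
P \approx \sqrt{3}\,V_{\rm line} I \approx \sqrt{3} \times 765\ \mathrm{kV} \times 1.5\ \mathrm{kA} \approx 2 \times 10^{9}\ \mathrm{W}\,.
$$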

Laughlin’s chain of arguments forms a composite approach to the energy challenge, where fossil fuels will still be needed 200 years from now to fly aeroplanes. Nuclear power plants will inevitably (but cautiously) be exploited and solar energy will offer decisive solutions in limited environments (see chapter nine, “Viva Las Vegas!”). While the author acknowledges that market forces (and not green technology) will be the future driver of energy innovation, the book does not explicitly support any partisan cause but tries to inspect thoroughly the issues at stake.

A few tweets may not suffice to develop informed views on the energy future of the human race. On the other hand, Powering the Future will certainly stimulate many readers (including, I hope, physicists) to form their own judgements and to challenge some of the canned statements that proliferate on the internet these days.

The History of Mathematics: A Very Short Introduction

By Jacqueline Stedall
Oxford University Press
Paperback: £7.99 $11.95

What a wonderful surprise. I was going to review another book before this one but it wasn’t to my liking (actually it was pretty bad) and I gave up after the first few chapters. So I settled instead on this book, mainly because it is short, or “very short” as the subtitle suggests.

Seeing that it was part of a series, I was expecting a typical history starting with Pythagoras and Euclid, then Newton and possibly Leibniz, Euler, Gauss and Riemann, followed by a collection of moderns, depending on how much space was left. I looked in the (excellent) index at the back (opening Q–Z) and was surprised to find no entry for Riemann. Was this British bias? No, Hardy was missing as well – but instead there were other people I’d never heard of: William Oughtred, for example (author of the first maths book published in Oxford), and Etienne d’Espagnet (who supplied Fermat with essential earlier works). Samuel Pepys also makes an appearance, but more as an example of how little maths educated people knew in the 17th century.

I learnt in this charming book that what I had been expecting is called the “stepping stone” approach to the history of mathematics, focusing on elite mathematicians. This book is refreshingly different. It is actually more about the subject “history of mathematics”, i.e. about how we compile and recount a history of mathematics rather than about a sequence of events. However, it does this by focusing on intriguing stories that show the various features that must be considered. In doing so, it fills in the water between the stepping stones, for example, in the story of Fermat’s last theorem. It also tells the story of the majority of people who actually do maths – schoolchildren – by discussing the class work in a Babylonian classroom (around 1850 BC), as well as in a Cumbrian classroom around 1800.

After reading this “preview version”, I am now going to get the “director’s cut” – The Oxford Handbook of the History of Mathematics, which Stedall edited with Eleanor Robson.

Happy reading and exploring!

LHC delivers for the summer conferences

With more luminosity delivered by the LHC between April and June 2012 than in the whole of 2011, the experiments had just what the collaborations wanted: as much data as possible before the summer conferences. By the time that a six-day period of machine development began on 18 June, the integrated luminosity for 2012 had reached about 6.6 fb⁻¹, compared with around 5.6 fb⁻¹ delivered in 2011.

The LHC’s performance over the preceding week had become so efficient that the injection kicker magnets – which heat up as the circulating beams pass through them – did not have time to cool down between fills. The kickers lose their magnetic properties when the ferrites at their centres become too hot, so on some occasions a few hours of cool-down time had to be allowed before beam for the next fill could be injected.

As the time constants for warming up and cooling down are both of the order of many hours, the temperature of the magnets turns out to provide a good indicator of the LHC’s running efficiency. The record for luminosity production of more than 1.3 fb⁻¹ in a single week corresponds well with the highest measured kicker-magnet temperature of 70°C. A programme is now under way to reduce further the beam impedance of the injection kickers, which should substantially reduce the heating effect in future.
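
To see why the temperature tracks running efficiency, consider a schematic first-order thermal model (an illustration, not the machine group’s actual analysis): with heat capacity C, thermal resistance R to surroundings at temperature T₀, and beam-induced heating power P(t),

$$
C\,\frac{dT}{dt} = P(t) - \frac{T - T_{0}}{R}\,, \qquad \tau = RC \sim \text{hours}\,,
$$

so the ferrite temperature is effectively a running average of the beam duty cycle over the preceding several hours.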

Routine operation of the LHC for physics is set to continue over the summer, with the machine operating with 1380 proton bunches in each beam – the maximum value for this year – and around 1.5 × 10¹¹ protons per bunch. The higher beam energy of 4 TeV (compared with 3.5 TeV in 2011) and the higher number of collisions are expected to enhance the machine’s discovery potential considerably, opening new possibilities in the searches for new and heavier particles.

100 years of cosmic rays

On 7 August 1912, Victor Hess took a now famous balloon flight in which he observed a “clearly perceptible rise in radiation with increasing height” and concluded that “radiation of very high penetrating power enters our atmosphere from above”.

This issue of the CERN Courier marks the centenary of the discovery of cosmic rays with a look at cosmic-ray research in the past, as well as at its future directions.

The experiments – and the results – have always been challenging, as a look at those before Hess shows (Domenico Pacini and the origin of cosmic rays). Nevertheless, they led to new techniques, such as the detection of Cherenkov radiation produced in the atmosphere (The discovery of air-Cherenkov radiation), now fundamental for high-energy gamma-ray astronomy (Cherenkov Telescope Array is set to open new windows). Large-scale experiments detect the highest-energy cosmic rays (Studies of ultra-high-energy cosmic rays look to the future) and have their sights on cosmic neutrinos (A neutrino telescope deep in the Mediterranean Sea) in a quest to discover the cosmic accelerators that surpass the highest energies attained in the laboratory. Meanwhile, the LHC contributes with useful data (LHCf: bringing cosmic collisions down to Earth) and some intriguing results (ALICE looks to the skies).

A surprising asymmetry and more excited states

The flavour-changing neutral-current decays B → K(*)μ+μ− provide important channels in the search for new physics, as they are highly suppressed in the Standard Model. The predictions in these channels suffer from relatively large theoretical uncertainties, but these can be overcome by measuring asymmetries in which the uncertainties cancel. One example is the isospin asymmetry A_I, which compares the decays B0 → K0(K*0)μ+μ− and B+ → K+(K*+)μ+μ−. In the Standard Model, A_I is predicted to be small, around –1%, for the decays to the excited K*; while there is no precise prediction for the decays to the K, a similar value is expected.
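
Explicitly, the isospin asymmetry is defined in terms of partial widths (in practice it is measured via branching fractions, correcting by the ratio of the B0 and B+ lifetimes):

$$
A_I = \frac{\Gamma(B^0 \to K^{(*)0}\mu^+\mu^-) - \Gamma(B^+ \to K^{(*)+}\mu^+\mu^-)}{\Gamma(B^0 \to K^{(*)0}\mu^+\mu^-) + \Gamma(B^+ \to K^{(*)+}\mu^+\mu^-)}\,.
$$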

LHCb has measured A_I for these decays as a function of the square of the dimuon mass, q², using data corresponding to an integrated luminosity of 1.0 fb⁻¹, with a surprising result. While the measurements for B → K*μ+μ− are consistent with the prediction of negligible isospin asymmetry, the value for B → Kμ+μ− is non-zero. In particular, in the two q² bins below 4.3 GeV²/c⁴ and in the highest bin, above 16 GeV²/c⁴, the isospin asymmetry in the B → Kμ+μ− channel is negative. These q² regions are furthest from the charmonium regions and are cleanly predicted theoretically. The measured asymmetry is dominated by the deficit observed in B0 → K0μ+μ−. Integrated over the dimuon mass range, the result for A_I deviates from zero by more than 4σ.

These results were obtained with the full data sample for 2011, which should more than double by the end of 2012. In the meantime, theorists will analyse this puzzling result to establish whether this effect can be accommodated in the framework of the Standard Model – or whether its explanation requires new physics.

In a different study, LHCb observed two excited Λb states for the first time, as predicted within the context of the quark model. The excited states (see figure) were reconstructed in three steps. First, Λc+ particles were reconstructed through their decay Λc+ → pK−π+; then the Λc+ particles were combined with π− to look for Λb0 particles; finally, the Λb0 particles were combined with π+π− pairs. In this way the team found about 16 Λb(5912)0 → Λb0π+π− decays (4.9σ significance) and about 50 Λb(5920)0 → Λb0π+π− decays (10.1σ) among some 6 × 10¹³ proton–proton collisions detected during 2011.

Seeing bosons in heavy-ion collisions

Studies of heavy-ion collisions at the LHC are challenging and refining ideas on how to probe QCD – the theory of the strong interaction – at high temperature and density. From precision analyses of particle “flow” that clearly distinguish pre-collision effects from post-collision effects, to the observation of jet quenching, the ATLAS collaboration is releasing many new results. Several of these observations are surprising and unexpected, such as the occurrence of strong jet quenching with almost no jet broadening; and complete explanations are currently lacking. One new set of results, however, spectacularly confirms expectations: photons and the heavy W and Z bosons are unaffected by the hot dense QCD medium.

Direct measurements of energetic photon production, released recently by the collaboration, show that the number of photons produced is just as would be expected from ordinary proton–proton collisions when extrapolated to the multiple collisions within the heavy-ion interactions. This effect is independent of the “centrality” of the collision, the parameter that distinguishes head-on (central) collisions from grazing collisions. Similar observations have been made at much lower energies. However, by taking advantage not only of the LHC beam energy but also of the capacity of the ATLAS calorimeters to make precision measurements and reject background events, this new study extends the results to energies 10 times higher for central collisions.

ATLAS has also released new measurements of Z-boson production, which show that, like photons, Zs are unaffected by the heavy-ion environment; the number produced is exactly what would be expected from “binary scaling”, i.e. scaling up to the number of nucleon collisions. The Z bosons were measured through their decays both to two muons, using the ATLAS muon spectrometer, and to electron–positron pairs, with the ATLAS calorimeters. The observation of binary scaling not only shows that the Zs are unaffected by the medium, but it reveals that the electrons, positrons and muons produced are also unaffected, as expected.
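
In the usual language of the field, binary scaling corresponds to a nuclear modification factor of unity (schematically, for the yield N of a hard probe such as the Z):

$$
R_{AA} = \frac{N^{AA}}{\langle N_{\rm coll} \rangle\, N^{pp}} = 1\,,
$$

where ⟨N_coll⟩ is the average number of binary nucleon–nucleon collisions estimated for the selected centrality class.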

These results open up a long-dreamt-of possibility in this field: the study of jet–boson correlations. Because the bosons are unaffected by the hot, dense medium, they can be used as a “control” to study precisely the suppression of jets. ATLAS is already making prototype measurements of this kind and high precision should be attainable in future LHC runs.

• For more information, see https://twiki.cern.ch/twiki/bin/view/AtlasPublic.

EXO, MINOS and OPERA reveal new results

The first results from the Enriched Xenon Observatory 200 (EXO-200) on the search for neutrinoless double beta decay show no evidence for this hypothesised process, which would shed new light on the nature of the neutrino. Located in the US Department of Energy’s Waste Isolation Pilot Plant in New Mexico, EXO-200 is a large beta-decay detector. In 2011 it was the first to measure two-neutrino double beta decay in 136Xe; now it has set a lower limit for neutrinoless double beta decay for the same isotope.

Double beta decay, first observed in 1986, occurs when a nucleus is energetically unable to decay via single beta decay, but can instead lose energy through the conversion of two neutrons to protons, with the emission of two electrons and two antineutrinos. The related process without the emission of antineutrinos is theoretically possible but only if the neutrino is a “Majorana” particle, i.e. it is its own antiparticle.

EXO-200 uses 200 kg of 136Xe to search for double beta decay. Xenon can be easily purified and reused, and it can be enriched in the 136Xe isotope using Russian centrifuges, which makes processing large quantities feasible. It also has a decay energy – Q-value – of 2.48 MeV, high enough to be above many of the uranium emission lines. Using 136Xe as a scintillator gives excellent energy resolution through the collection both of ionization electrons and of scintillation light. Finally, using xenon offers the possibility of complete background elimination through tagging of the daughter barium ion. These features, combined with the detector’s location more than 650 m underground and the use of materials selected and screened for radiopurity, ensure that other traces of radioactivity and cosmic radiation are eliminated or kept to a minimum. The latest results reflect this low background activity and high sensitivity – only one event was recorded in the region where neutrinoless double beta decay was expected.

In the latest result, no signal for neutrinoless double beta decay was observed for an exposure of 32.5 kg·yr, with a background of about 1.5 × 10⁻³ kg⁻¹ yr⁻¹ keV⁻¹. This sets a lower limit on the half-life of neutrinoless double beta decay in 136Xe of more than 1.6 × 10²⁵ yr, corresponding to an effective Majorana mass of less than 140–380 meV, depending on details of the calculation (Auger et al. 2012).
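
The conversion from a half-life limit to a mass range follows the standard (schematic) relation – the spread in the quoted mass bound reflects the fact that the nuclear matrix element M⁰ᵛ is known only from model calculations:

$$
\left[T_{1/2}^{0\nu}\right]^{-1} = G^{0\nu}\,\bigl|M^{0\nu}\bigr|^{2}\,\frac{\langle m_{\beta\beta}\rangle^{2}}{m_{e}^{2}}\,,
$$

where G⁰ᵛ is a calculable phase-space factor and ⟨m_ββ⟩ = |Σ_i U_ei² m_i| is the effective Majorana neutrino mass.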

The EXO collaboration announced the results at Neutrino 2012, the 25th International Conference on Neutrino Physics and Astrophysics, held in Kyoto, on 3–9 June. This dedicated conference for the neutrino community provided the occasion for many neutrino experiments to publicize their latest results. In the case of the MINOS collaboration, these included the final results from the first phase of the experiment, which studies oscillations between neutrino types.

In 2010 the MINOS collaboration caused a stir when it announced the observation of a surprising difference between neutrinos and antineutrinos. Measurements of a key parameter used in the study of oscillations – Δm², the difference in the squares of the masses of two oscillating types – gave different values for neutrinos and antineutrinos. In 2011, additional statistics brought the values closer together and, with twice as much antineutrino data collected since then, the gap has now closed. From a total exposure of 2.95 × 10²⁰ protons on target, the value found for muon antineutrinos was Δm² = (2.62 +0.31/–0.28 (stat.) ± 0.09 (syst.)) × 10⁻³ eV², and the antineutrino “atmospheric” mixing angle was constrained to sin²2θ > 0.75 at 90% confidence level (Adamson et al. 2012). These values are in agreement with those measured for muon neutrinos.
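
For context, these are the parameters of the standard two-flavour oscillation formula, which gives the survival probability of a muon antineutrino of energy E after travelling a distance L:

$$
P(\bar{\nu}_\mu \to \bar{\nu}_\mu) = 1 - \sin^{2}2\theta\,\sin^{2}\!\left(\frac{1.27\,\Delta m^{2}\,[\mathrm{eV}^2]\,L\,[\mathrm{km}]}{E\,[\mathrm{GeV}]}\right).
$$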

Since its debut in 2006, the OPERA experiment in the Gran Sasso National Laboratory has been searching for neutrino oscillations in which muon-neutrinos transform into τ-neutrinos as they travel the 730 km of rock between CERN, where they originate, and the laboratory in Italy. At the conference, the OPERA collaboration announced the observation of its second τ-neutrino, following the first observation two years ago. This new event is an important step towards the experiment’s final goal.

Results on the time of flight of neutrinos from CERN to the Gran Sasso laboratory were also presented by CERN’s director for research and scientific computing, Sergio Bertolucci, on behalf of four experiments. All four – Borexino, ICARUS, LVD and OPERA – measure a neutrino time of flight that is consistent with the speed of light. The indications are that the anomalous measurement announced by OPERA last September can be attributed to a faulty element of the experiment’s fibre-optic timing system.
