
A First Course in Mathematical Physics

By Colm T Whelan
Wiley-VCH


The aim of this book is to provide undergraduate students in the physical sciences with the fundamental mathematical tools they need to proceed with their studies.

In the first part the author introduces core mathematics, starting from basic concepts such as functions of one variable and complex numbers, and moving to more advanced topics including vector spaces, fields and operators, and functions of a complex variable.

The second part shows some of the many applications of these mathematical tools to physics. When introducing complex physical laws and theories, including Maxwell’s equations, special relativity and quantum theory, the author presents the material in an easily intelligible way, and emphasises the direct connection between the conceptual basis of these physics topics and the mathematical tools provided in the first part of the text.

Two appendices of formulas conclude the book. A large number of problems are included but the solutions are only made available on a password-protected website for lecturers.

Thorium Energy for the World

By J-P Revol, M Bourquin, Y Kadi, E Lillestol, J-C de Mestral and K Samec (eds)
Springer


This book contains the proceedings of the Thorium Energy Conference (ThEC13), held in October 2013 at CERN, which brought together some of the world’s leading experts on thorium technologies. According to these experts, nuclear energy based on a thorium fuel cycle is safer and cleaner than that based on uranium. In addition, long-lived waste from existing power plants could be retrieved and integrated into the thorium fuel cycle, to be transformed into stable material while generating electricity.

The technology required to implement this type of fuel cycle is already being developed; nevertheless, much effort and time are still needed.

The ThEC13 conference saw the participation of high-level speakers from 30 countries, among them Nobel laureates Carlo Rubbia and Jack Steinberger, the then CERN Director-General Rolf Heuer, and Hans Blix, former director-general of the International Atomic Energy Agency (IAEA).

Collecting the contributions of the speakers, this book offers a detailed technical review of thorium-energy technologies from basic R&D to industrial developments, and is thus a tool for informed debates on the future of energy production and, in particular, on the advantages and disadvantages of different nuclear technologies.

Bose–Einstein Condensation and Superfluidity

By L Pitaevskii and S Stringari
Oxford University Press


This book deals with the fascinating topics of Bose–Einstein condensation (BEC) and superfluidity. The main emphasis is on providing the formalism needed to describe these phases of matter as observed in the laboratory – formalism that goes well beyond the idealised systems for which BEC was originally predicted, and that is essential for interpreting the experimental observations.

BEC was predicted in 1925 by Einstein, building on the ideas of Satyendra Nath Bose. It corresponds to a new phase of matter in which bosons accumulate in the lowest energy level and develop coherent quantum properties on a macroscopic scale. These properties give rise to phenomena that seem impossible from an everyday perspective. In particular, BEC lies behind the theory of superfluids – fluids that flow without dissipating energy and rotate without generating vorticity, apart from quantised vortices, which are a kind of topological defect.
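To give one concrete formula, a standard textbook result (quoted here for orientation, not from the book itself) is the condensation temperature of a uniform ideal Bose gas of particle mass m and number density n:

\[
k_B T_c = \frac{2\pi\hbar^2}{m}\left(\frac{n}{\zeta(3/2)}\right)^{2/3},
\]

below which a macroscopic fraction of the atoms occupies the lowest single-particle state.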

Experimentally, the first BEC in dilute gases was observed in the laboratory in 1995, an achievement recognised by the 2001 Nobel Prize in Physics. Since then, there has been an explosion of interest and new results in the field. It is thus timely that two of its leading experts have updated and extended their volume on BEC to summarise the theoretical aspects of this phase of matter. The authors also describe in detail how superfluid phenomena can occur in Fermi gases in the presence of interactions.

The book is relatively heavy in formalism, which is justified by the wide range of phenomena covered in a relatively concise volume. It starts with some basics about correlation functions, condensation and statistical mechanics. Next, it delves into the simplest systems for which BEC can occur: weakly coupled dilute gases of bosonic particles. The authors describe different approaches to the BEC phase, including the works of Landau and Bogoliubov. They also introduce the Gross–Pitaevskii equation and show its importance in the description of superfluids. Superfluidity is explained in great detail, in particular the occurrence of quantised vortices.
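Since the Gross–Pitaevskii equation plays such a central role, it is worth recalling its standard form (a general-literature statement, not a quotation from the book):

\[
i\hbar\,\frac{\partial\psi}{\partial t} = \left(-\frac{\hbar^2}{2m}\nabla^2 + V_{\rm ext}(\mathbf{r}) + g\,|\psi|^2\right)\psi,
\qquad g = \frac{4\pi\hbar^2 a}{m},
\]

where a is the s-wave scattering length; the nonlinear term g|ψ|² encodes the interactions that distinguish a real condensate from the ideal-gas picture.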

The second part describes how to adapt the theoretical formalism introduced in the first part to realistic traps where BEC is observed. This is very important to connect theoretical descriptions to laboratory research, for instance to predict in which experimental configurations a BEC will appear and how to characterise it.

Part three deals with BEC in fermionic systems, which is possible if the fermions interact and pair-up into bosonic structures. These fermionic phases exhibit superfluid properties and have been created in the laboratory, and the authors consider fermionic condensates in realistic traps. The final part is devoted to new phenomena appearing in mixed bosonic–fermionic systems.

The book is a good resource for the theoretical description of BEC beyond the idealised configurations described in many texts. The concise style and the large amount of notation require constant effort from the reader, but this seems inevitable given the many surprising phenomena appearing in BECs. The book, perhaps combined with others, will provide the reader with a clear overview of the topic and of the latest theoretical developments in the field. The text is enhanced by the many figures and plots presenting experimental data.

Making Sense of Quantum Mechanics

By Jean Bricmont
Springer


In this book, Jean Bricmont aims to challenge Richard Feynman’s famous statement that “nobody understands quantum mechanics” and discusses some of the issues that have surrounded this field of theoretical physics since its inception.

Bricmont starts by strongly criticising the “establishment” view of quantum mechanics (QM), known as the Copenhagen interpretation, which attributes a key role to the observer in a quantum measurement. The quantum-mechanical wavefunction, indeed, predicts the possible outcomes of a quantum measurement, but not which one of them actually occurs. The author opposes the idea that a conscious human mind is an essential part of the process of determining which outcome is obtained. This interpretation was proposed by some of the early thinkers on the subject, although I believe Bricmont is wrong to associate it with Niels Bohr, who related measurement to irreversible changes in the measuring apparatus, rather than in the mind of the human observer.

The second chapter deals with the nature of the quantum state, illustrated with discussions of the Stern–Gerlach experiment to measure spin and the Mach–Zehnder interferometer to emphasise the importance of interference. During the last 20 years or so, much work has been done on “decoherence”. This has shown that the interaction of the quantum system with its environment, which may include the measuring apparatus, prevents any detectable interference between the states associated with different possible measurement outcomes. Bricmont correctly emphasises that this still does not result in a particular outcome being realised.

The author’s central argument is presented in chapter five, where he discusses the de Broglie–Bohm hidden-variable theory. At its simplest, it proposes that there are two components to the quantum-mechanical state: the wavefunction and an actual point particle that always has a definite position, although this is hidden from observation until its position is measured. This model claims to resolve many of the conceptual problems thrown up by orthodox QM: in particular, the outcome of a measurement is determined by the position of the particle being measured, while the other possibilities implied by the wavefunction can be ignored because they are associated with “empty waves”. Bricmont shows how all the results of standard QM – particularly the statistical probabilities of different measurement outcomes – are faithfully reproduced by the de Broglie–Bohm theory.
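For a single spinless particle, the guidance equation of the theory takes a compact standard form (a textbook statement, included here for orientation rather than quoted from Bricmont):

\[
\frac{d\mathbf{X}}{dt} = \frac{\hbar}{m}\,\operatorname{Im}\!\left(\frac{\nabla\psi}{\psi}\right)\Bigg|_{\mathbf{x}=\mathbf{X}(t)},
\]

so the particle position X(t) is carried along by the wavefunction ψ, which itself always evolves according to the Schrödinger equation.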

This is probably the clearest account of this theory that I have come across. So why is the de Broglie–Bohm theory not generally accepted as the correct way to understand quantum physics? One reason follows from the work of John Bell, who showed that no hidden-variable theory can reproduce the quantum predictions (now thoroughly verified by experiment) for systems consisting of two or more particles in an entangled state unless the theory includes non-locality – i.e. a faster-than-light communication between the component particles and/or their associated wavefunctions. As this is clearly inconsistent with special relativity, many thinkers (including Bell himself) have looked elsewhere for a realistic interpretation of quantum phenomena. Not so Jean Bricmont: along with other contemporary supporters of the de Broglie–Bohm theory, he embraces non-locality and looks to use the idea to enhance our understanding of the reality he believes underlies quantum physics. In fact he devotes a whole chapter to this topic and claims that non-locality is an essential feature of quantum physics and not just of models based on hidden variables.
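Bell’s constraint can be stated compactly in its CHSH form (a standard result, added here for orientation): for two measurement settings a, a′ and b, b′ on the two entangled particles, any local hidden-variable theory obeys

\[
|S| = \left|E(a,b) - E(a,b') + E(a',b) + E(a',b')\right| \le 2,
\]

whereas quantum mechanics predicts, and experiments confirm, values of |S| up to 2√2.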

Other problems with the de Broglie–Bohm theory are discussed and resolved – to the author’s satisfaction at least. These include how the de Broglie–Bohm model can be consistent with the Heisenberg uncertainty principle when it appears to assume that the particle always has a definite position and momentum; he points out that the statistical results of a large number of measurements always agree with conventional predictions, and these include the uncertainty principle.

Alternative ways to interpret QM are presented, but the author does not find in them the same advantages as in the de Broglie–Bohm theory. In particular, he discusses the many-worlds interpretation, which assumes that the only reality is the wavefunction and that, rather than collapsing at a measurement, this produces branches that correspond to all measurement outcomes. One of the consequences of decoherence is that there can be no interference between the branches corresponding to different measurement outcomes, and this means that no branch can be aware of any other. It follows that, even if a human observer is involved, each branch can contain a copy of him or her who is unaware of the others’ presence. From this point of view, all the possible measurement outcomes co-exist – hence the term “many worlds”. Apart from its ontological extravagance, the main difficulty with the many-worlds theory is that it is very hard to see how the separate outcomes can have different probabilities when they all occur simultaneously. Many-worlds supporters have proposed solutions to this problem, but they do not satisfy Bricmont (nor, indeed, me), who emphasises that this is not a problem for the de Broglie–Bohm theory.

A chapter is also dedicated to a brief discussion of philosophy, concentrating on the concept of realism and how it contrasts with idealism. Unsurprisingly, it concludes that realists want a theory describing what happens at the micro scale that accounts for predictions made at the macro scale – and that de Broglie–Bohm provides just such a theory.

The book concludes with an interesting account of the history of QM, including the famous Bohr–Einstein debate, the struggle of de Broglie and Bohm for recognition, and the influence of the politics of the time.

This is a clearly written and interesting book. It has been very well researched, containing more than 500 references, and I would thoroughly recommend it to anyone who has an undergraduate knowledge of physics and mathematics and an interest in foundational questions. Whether it actually lives up to its title is for each reader to judge.

LHC back with a splash

On 29 April, just after 8.00 p.m., the Large Hadron Collider (LHC) began circulating beams of protons for the first time this year. Extensive technical and maintenance work had been carried out since the machine entered its end-of-year stop in early December, and the restart of the 27 km-circumference superconducting collider proceeded smoothly.

Magnet powering tests, which ensured the machine can be operated at an energy of 6.5 TeV per beam, were completed during the last week of April. This was followed by the machine-checkout phase, during which all equipment is placed in its operational state and the four LHC experiment caverns are patrolled and closed.

In the meantime, the crew of the Super Proton Synchrotron (SPS), which feeds protons to the LHC, worked hard to extract a single-bunch beam so that the LHC could be commissioned with beam. By the end of the afternoon on Friday 28 April, protons had been sent successfully down both transfer lines and were knocking at the LHC’s door. The following day, at 6.00 p.m., beam 1 (clockwise direction) was injected and threaded through the LHC’s eight sectors one at a time, circulating around the entire machine within 45 minutes. Beam 2 (anticlockwise direction) then went through the same process, and at 8.12 p.m. both beams were circulating. On Sunday 30 April, the single-bunch, low-intensity beams were successfully ramped to an energy of 6.5 TeV.

The next task, which was well under way as the Courier went to press, was to continue with the detailed setting up of the machine while stepping up to higher bunch intensities and then to multiple bunches. Each step in the intensity ramp-up has to be validated by circulating the beams for three fills of up to 20 hours each, and the team is aiming for a configuration of 2550 bunches per beam, with each bunch containing of the order of 1.2 × 10¹¹ protons. Once stable beams have been declared, expected in the second half of May, the beams will be brought into collision and the second chapter of LHC Run 2 will be under way.
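As a rough cross-check of what such a filling scheme implies (a back-of-envelope estimate using only the numbers quoted above, not an official figure), the energy stored in one fully filled beam can be computed directly:

# Back-of-envelope estimate of the energy stored in one LHC beam,
# using the target filling scheme quoted above (illustrative only).
E_PER_EV = 1.602e-19          # joules per electronvolt
n_bunches = 2550              # bunches per beam
protons_per_bunch = 1.2e11    # protons per bunch
proton_energy_eV = 6.5e12     # 6.5 TeV per proton

n_protons = n_bunches * protons_per_bunch
stored_energy_J = n_protons * proton_energy_eV * E_PER_EV
print(f"Stored energy per beam: {stored_energy_J / 1e6:.0f} MJ")  # ~320 MJ

At roughly 320 MJ per beam, it is clear why each step of the intensity ramp-up is validated so carefully.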

Workshop puts advanced accelerators on track

Researchers working on advanced and novel accelerator technologies met at CERN on 25–28 April to draw up an international roadmap for future high-energy particle accelerators. Organised by the International Committee for Future Accelerators (ICFA) to trigger a community effort, the Advanced and Novel Accelerator for High Energy Physics Roadmap Workshop saw 80 experts from 11 countries discuss the steps needed to include these new technologies in strategies for future machines in Asia, Europe and the US. It follows recent discussions of national roadmaps in the US and elsewhere, and was timed so that advanced accelerator development can be taken into account in the coming update of the European Strategy for Particle Physics in 2020.

Given the scale of the cost of traditional accelerator technologies, which require large circular or long linear accelerators to reach the highest energies, the past two decades have seen significant progress in finding alternative approaches. These include dielectrics and plasmas driven by laser pulses or particle beams, which can sustain accelerating fields roughly 1000 times higher than the radio-frequency structures used in today’s accelerators. Major laboratories including CERN, SLAC, Argonne, DESY and INFN-Frascati are working on various techniques. CERN has recently started the AWAKE experiment, which aims to demonstrate that high-energy protons from the SPS can drive large accelerating fields in a plasma.
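The factor of 1000 can be checked against the standard cold-plasma wave-breaking estimate E₀[V/m] ≈ 96 √(n₀[cm⁻³]), a rule of thumb from the plasma-acceleration literature; the density and RF gradient below are illustrative assumptions, not parameters of any particular experiment:

import math

# Cold-plasma wave-breaking field: E0 [V/m] ~ 96 * sqrt(n0 [cm^-3]).
n0 = 1e18                 # plasma electron density in cm^-3 (illustrative)
E0 = 96 * math.sqrt(n0)   # accelerating field in V/m
rf = 1e8                  # ~100 MV/m, typical of modern RF structures
print(f"Plasma field: {E0 / 1e9:.0f} GV/m, about {E0 / rf:.0f}x a typical RF gradient")

For a density of 10¹⁸ cm⁻³ this gives about 96 GV/m, three orders of magnitude above conventional RF gradients.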

The next step is to apply these methods to high-energy physics. For example, the acceleration schemes must be tuned to determine their real potential for producing high-energy and high-quality particle bunches; the former has been demonstrated, but the latter remains a challenge. This requires experimental facilities that can only be hosted by international laboratories and a strong, united and co-ordinated community that merges the advanced and traditional accelerator communities.

The April event has now set this process in motion, focusing on the technical milestones that are needed to progress towards an intermediate-size particle accelerator and on the strategies needed to bring communities together. A new working group dedicated to the development of a roadmap will be included in the European Advanced Accelerators Concept Workshop in September 2017 in Elba, Italy.

“This is the first time that the advanced accelerator field is co-ordinated at the international level, and will pull the community together towards the first great challenge ahead, i.e. the achievement of reliable and high-quality particle bunches,” says workshop chair Brigitte Cros. “Further workshops will be organised to strengthen and sustain this co-ordination.”

CAST experiment constrains solar axions

In a paper published in Nature Physics, the CERN Axion Solar Telescope (CAST) has reported important new exclusion limits on the coupling of axions to photons. Axions are hypothetical particles that interact very weakly with ordinary matter and are therefore candidates to explain dark matter. They were postulated decades ago to solve the “strong CP” problem of the Standard Model (SM), which concerns an unexpected time-reversal symmetry of the nuclear forces. Axion-like particles, unrelated to the strong-CP problem but still viable dark-matter candidates, are also predicted by several theories of physics beyond the SM, notably string theory.

A variety of Earth- and space-based observatories are searching possible locations where axions could be produced, ranging from the inner Earth to the galactic centre and right back to the Big Bang. CAST looks for solar axions using a “helioscope” constructed from a test magnet originally built for the Large Hadron Collider. The 10 m-long superconducting magnet acts like a viewing tube and is pointed directly at the Sun: solar axions entering the tube would be converted by its strong magnetic field into X-ray photons, which would be detected at either end of the magnet. Starting in 2003, the CAST helioscope, mounted on a movable platform and aligned with the Sun with a precision of about 1/100th of a degree, has tracked the movement of the Sun for an hour and a half at dawn and an hour and a half at dusk, over several months each year.
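The helioscope principle can be summarised by a standard formula (quoted from the general axion literature, not from the CAST paper): for effectively massless axions, the probability of coherent axion–photon conversion in a transverse magnetic field B over a length L is, in natural units,

\[
P_{a\to\gamma} = \left(\frac{g_{a\gamma}\,B\,L}{2}\right)^2,
\]

where g_{aγ} is the axion–photon coupling; this (BL)² scaling is what makes a long, strong LHC-class magnet such a sensitive detector.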

In the latest work, based on data recorded between 2012 and 2015, CAST finds no evidence for solar axions. This has allowed the collaboration to set the best limits to date on the strength of the coupling between axions and photons for all possible axion masses to which CAST is sensitive. The limits concern a part of the axion parameter space that is still favoured by current theoretical predictions and is very difficult to explore experimentally, allowing CAST to approach the more restrictive constraints set by astrophysical observations. “Even though we have not been able to observe the ubiquitous axion yet, CAST has surpassed even the sensitivity originally expected, thanks to CERN’s support and unrelenting work by CASTers,” says CAST spokesperson Konstantin Zioutas. “CAST’s results are still a point of reference in our field.”

The experience gained by CAST over the past 15 years will help physicists to define the detection technologies suitable for a proposed, much larger, next-generation axion helioscope called IAXO. Since 2015, CAST has also broadened its research at the low-energy frontier to include searches for dark-matter axions and candidates for dark energy, such as solar chameleons.

Belle II rolls in

On 11 April, the Belle II detector at the KEK laboratory in Japan was successfully “rolled in” to the collision point of the upgraded SuperKEKB accelerator, marking an important milestone for the international B-physics community. The Belle II experiment is an international collaboration hosted by KEK in Tsukuba, Japan, with physics goals related to those of the LHCb experiment at CERN but in the pristine environment of electron–positron collisions. It will analyse copious quantities of B mesons to study CP violation and search for signs of physics beyond the Standard Model (CERN Courier September 2016 p32).

“Roll-in” involved moving the entire 8 m-tall, 1400 tonne Belle II detector system from its assembly area to the beam-collision point 13 m away. The detector is now integrated with SuperKEKB, and all seven of its subdetectors are in place except for the innermost vertex detector. The next step is to install the complex focusing magnets around the Belle II interaction point. SuperKEKB achieved its first turns in February 2016, with operation of the main rings scheduled for early spring and phase-III “physics” operation by the end of 2018.

Compared to the previous Belle experiment, and thanks to major upgrades made to the former KEKB collider, Belle II will allow much larger data samples to be collected with much improved precision. “After six years of gruelling work with many unexpected twists and turns, it was a moving and gratifying experience for everyone on the team to watch the Belle II detector move to the interaction point,” says Belle II spokesperson Tom Browder. Flavour physics is now the focus of much attention and interest in the community, and Belle II will play a critical role in the years to come.

CERN on the road

CERN has begun major work to create a new visitor space called the Esplanade des Particules, to welcome the ever-growing number of visitors to the laboratory each year. The project, undertaken in conjunction with the Etat de Genève, will integrate the laboratory better into the local urban landscape, making it more open and easily accessible; work is expected to last until summer 2018.

A competition was launched in 2011 to redesign the public entrance to CERN. The landscape architects Studio Paolo Bürgi won with a design for a large space dedicated to pedestrians that connects CERN’s reception to the Globe of Science and Innovation. The Esplanade des Particules will see the current “Flags Car Park” replaced by a blue pedestrianised area in which the flags of the CERN Member States will cross the main road to the laboratory.

LHCb finds new hints of Standard Model discrepancy

At a seminar at CERN on 18 April, the LHCb collaboration presented new results in flavour physics that show an interesting departure from Standard Model (SM) predictions. The new measurement concerns a parameter called RK*0, which is the ratio of the probabilities that a B0 meson decays to K*0μ+μ− and to K*0e+e− (where the K*0 meson is reconstructed through its decay into a charged kaon K+ and a pion π−).
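Written out explicitly (in standard notation, with \(\mathcal{B}\) denoting a branching fraction), the ratio is

\[
R_{K^{*0}} = \frac{\mathcal{B}(B^0 \to K^{*0}\,\mu^+\mu^-)}{\mathcal{B}(B^0 \to K^{*0}\,e^+e^-)},
\]

measured, in practice, in specific regions of the dilepton invariant mass squared.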

"

Lepton universality – a cornerstone of the SM – states that leptons have the same couplings to gauge bosons, and therefore that RK*0 is expected to be close to unity (apart from well-understood effects related to the different masses of the leptons, which change this value slightly). Any conclusive observation of a violation of this rule would indicate the existence of physics beyond the SM. Based on analysis of data from Run 1, the LHCb measurement differs from the prediction with a significance between 2.1 and 2.5 standard deviations in the two regions of q² (the μ+μ− or e+e− invariant mass squared) in which the measurement is performed.

Three years ago, LHCb found a similar discrepancy for the quantity RK, in which the B0 meson is replaced by a B+ and the K*0 meson by a K+. In addition, another class of measurements, concerning different ratios of B-meson decay rates involving τ leptons and muons, also exhibits some tension with predictions. While intriguing, none of the differences is yet at a level where it can be claimed as evidence for physics beyond the SM.

The LHCb collaboration has a wide programme of lepton-universality tests based on different R measurements, in which other particles replace the K*0 or K+ mesons in the ratios. The RK*0 and RK measurements obtained so far used the entire Run 1 data sample, corresponding to an integrated luminosity of 3 fb⁻¹ at centre-of-mass energies of 7 and 8 TeV. Data collected in Run 2 already provide a sample more than twice as large, and it is therefore of great importance to see whether updates of the present analysis will confirm or rule out the discrepancies.
