Synchrotron Radiation: Basics, Methods and Applications

By S Mobilio, F Boscherini and C Meneghini (eds)
Springer


Observed for the first time in 1947 – and long considered a problem for particle physics, since it causes particle beams to lose energy – synchrotron radiation is today a fundamental tool for characterising nanostructures and advanced materials. Thanks to its brilliance, spectral range, time structure and coherence, it is applied extensively in many scientific fields, spanning materials science, chemistry, nanotechnology, earth and environmental sciences, biology, medicine, and even archaeology and cultural heritage.

The book collects the lecture notes of the 12th edition of the School on Synchrotron Radiation, held in Trieste, Italy, in 2013 and organised by the Italian Synchrotron Radiation Society in collaboration with Elettra-Sincrotrone Trieste. It is organised in four parts. The first describes the emission of synchrotron and free-electron-laser sources, as well as the basic aspects of beamline instrumentation. The second illustrates the fundamental interactions between electromagnetic radiation and matter. The third discusses the most important experimental methods, including different types of spectroscopy, diffraction and scattering, and microscopy and imaging techniques. The fourth gives an overview of the numerous applications of these techniques in various research fields, and also dedicates a chapter to the new generation of synchrotron-radiation sources based on free-electron lasers, which are opening the way to new applications and more precise measurements.

This comprehensive book is aimed at both PhD students and more experienced researchers, since it not only provides an introduction to the field but also treats relevant topics in depth.

The European Research Council

By Thomas König
Polity Press

Established in 2007 to fund frontier-research projects, the European Research Council (ERC) has quickly become a fundamental instrument of science policy at the European level, as well as a quality standard for academic research. This book traces the history of the creation and development of the ERC, drawing on the first-hand knowledge of the author, who was scientific adviser to the president of the ERC for four years. It covers the period from the early 2000s – when a group of strong-minded scientists pushed the idea of allocating (more) money to research projects selected on the quality of the proposals, as judged by independent, competent and impartial reviewers – to 2013, when the first ERC programme cycle concluded.

The author is particularly interested in the politics behind those events, and shows how the ERC became a reality once the European Commission decided to support it, adopting a much more strategic, planned and technical approach. He also describes how the ERC was implemented and how its scientific council was created, discusses the “hybrid” nature of the ERC – somewhere between a programme and an institution – and the frictions this caused in its early days, as well as the process of establishing a procedure for selecting applications for funding.

While telling the story of the ERC from a critical perspective and examining its challenges and achievements, the book also offers a view of the relationship between science and policy in the 21st century.

The Many Faces of Maxwell, Dirac and Einstein Equations: A Clifford Bundle Approach (2nd edition)

By Waldyr A Rodrigues Jr and Edmundo Capelas de Oliveira
Springer

In theoretical physics, hardly anything is better known than the Einstein, Maxwell and Dirac equations. The Dirac and Maxwell equations (as well as the analogous Yang–Mills equations) form the basis of the modern description of matter via the electromagnetic, weak and strong interactions, while the Einstein equations of general relativity are the foundation of the theory of gravity. Taken together, these equations cover scales from the subatomic to the large-scale universe, and are the pillars on which the standard models of cosmology and particle physics are built. Although they constitute core knowledge for theoretical physicists, they are rarely, if ever, presented together.

This book aims to remedy the situation by providing a full description of the Dirac, Maxwell and Einstein equations. The authors go further, however, by presenting the equations in several different forms. Their aim is twofold. On one hand, different expressions of these famous formulae may help readers to view a given equation from new and possibly more fruitful perspectives (when the Maxwell equations are written in the form of the Navier–Stokes equations, for instance, they allow a hydrodynamic interpretation of the electrodynamic field). On the other hand, casting different equations in similar forms may shed light on the quest for unification – as happens, for example, when the authors rewrite Maxwell’s equations in Dirac-like form and use this to launch a digression on supersymmetry.
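
To give a flavour of the Clifford-bundle approach (a standard result of that formalism, sketched here for illustration rather than quoted from the book), the four Maxwell equations collapse into a single first-order equation for the electromagnetic field bivector F:

\[
\partial F = J, \qquad \partial \equiv \gamma^{\mu}\partial_{\mu}, \qquad F \equiv \tfrac{1}{2}\,F_{\mu\nu}\,\gamma^{\mu}\wedge\gamma^{\nu},
\]

where the \gamma^{\mu} generate the space–time Clifford algebra and \partial is the same first-order operator that appears in the Dirac equation, with \partial^{2} = \Box. The vector part of \partial F = J reproduces the inhomogeneous pair of Maxwell's equations, while the trivector part yields the homogeneous pair.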

Another feature of the book concerns concepts in differential geometry that are widely used in mathematics but remain relatively unfamiliar in theoretical physics. An example is the torsion of space–time: general differential manifolds are naturally equipped with a torsion in addition to the well-known curvature, and torsion also enters the description of Lie algebras, yet the torsional completion of Einstein gravity, for instance, has received very little investigation. The authors address this issue by presenting the most general differential geometry of space–time with both curvature and torsion. They then use it to understand conservation laws – more specifically, to better grasp the conditions under which these conservation laws may or may not fail. Trivially, a genuine conservation law expresses the fact that a certain quantity is constant over time, but in differential geometry there is no clear and unambiguous way to define an absolute time.
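
For readers who have not met it, torsion measures the failure of a connection \nabla to be symmetric; in standard notation (a textbook definition, not one specific to this book),

\[
T(X,Y) = \nabla_{X} Y - \nabla_{Y} X - [X,Y],
\]

which vanishes identically for the Levi-Civita connection of ordinary general relativity. Geometries with T \neq 0 are precisely the “twisted” space–times that the authors treat alongside curvature.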

As an additional important point, the book contains a thorough discussion of the role of active transformations for physical fields (to be distinguished from passive transformations, which are simply a change of co-ordinates). Active transformations are fundamental both to define the transformation properties of specific fields and to investigate their properties from a purely kinematic point of view, without involving field equations. A section is also devoted to exotic or new physical fields, such as the recently introduced “ELKO” field.

Aside from purely mathematical treatments, the book contains useful comments about fundamental principles (such as the equivalence principle) and physical effects (such as the Sagnac effect). The authors also pay attention to clarifying certain erroneous concepts that are widespread in physics, such as assigning a nonzero rest mass to the photon.

In summary, the book is well suited for anyone who has an interest in the differential geometry of twisted–curved space–time manifolds, and who is willing to work on generalisations of gravity, electrodynamics and spinor field theories (including supersymmetry and exotic physics) from a mathematical perspective. Perhaps the only feature that might discourage a potential reader, which the authors themselves acknowledge in the introduction, is the considerable amount of sophisticated formalism and mathematical notation. But this is the price one has to pay for such a vast and comprehensive discussion about the most fundamental tools in theoretical physics.

Infinitesimal: How a Dangerous Mathematical Theory Shaped the Modern World

By Amir Alexander
One World

Lying midway between the history and the philosophy of science, this book illuminates a fascinating period in European history during which mathematics clashed with common thought and religion. Set in the late 16th and early 17th centuries, it describes how the concept of infinitesimals – a quantity that is explicitly nonzero and yet smaller than any measurable quantity – took a central role in the debate between ancient and medieval ideas and the new ideas arising from the Renaissance. The former were represented by immutable divine order and the principle of authority, the latter by social change and experimentation.

The idea of indivisible quantities and their use in geometry and arithmetic, which had already been developed by ancient Greek mathematicians, underwent its own renaissance 500 years ago, at the same time as Martin Luther launched the Reformation. The consequences for mathematics and physics were enormous, giving rise to unprecedented scientific progress over the following decades and centuries. Even more striking is that the new way of thinking built around the concept of infinitesimals crossed the borders of science and strongly influenced society, to the point that mathematics became the main focus of the struggle between the old and new orders.

The book is divided into two parts, each devoted to a particular geographical area and period in which this battle took place. The first part takes the reader to late 16th-century Italy, where the flourishing and creative ideas of the Renaissance had produced a remarkable generation of mathematicians and scientists. Here, the prominent figure of Galileo Galilei – together with Evangelista Torricelli, Bonaventura Cavalieri and others – was at the forefront of the new mathematical approach involving the concept of infinitesimals. This established the basis of inductive reasoning, which makes broad generalisations from specific observations, and led to a new science founded on experience. On the opposite side, the Jesuit order enlisted mathematics in its fight against heresy and the Reformation: to the Jesuits, the traditional mathematical approach was a solid basis for the absolute truth represented by the Catholic faith and the authority of the Pope. The fierce opposition of the Jesuit mathematicians led to the condemnation of Galileo and the “infinitesimalists”, with irreparable consequences for the ancient tradition of Italian mathematics.

The second part of the book moves the reader to 17th-century England, just after the English Civil War, in the years of Cromwell's republic and the Restoration. In that context, the new ideas represented by infinitesimals were not only condemned by the Anglican Church but also opposed by political powers. Here, the leading figure of Thomas Hobbes took the stage in the fight against the indivisibles and the inductive method. For him, traditional Euclidean geometry – which, contrary to induction, uses deduction to derive every result from a few basic statements – was the highest expression of an ordered philosophical system and a model for a perfect state. Hobbes was also concerned about the threat the new mathematics posed to the principle of authority embodied in traditional mathematical thought. In his struggle against infinitesimals, he was confronted by members of the newly founded Royal Society, eager for scientific progress. Among them was John Wallis, who regarded mathematical knowledge as a bottom-up inductive system in which calculus played the role that experiments play in physics. Solving many of the toughest mathematical problems of his time by infinitesimal procedures, Wallis defeated traditional geometry – and Thomas Hobbes with it. Wallis's triumph cleared the way for scientific progress and for the advance of thought that opened the door to the Enlightenment.
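
A celebrated example of what Wallis achieved with these methods (recalled here as an illustration; it is a standard result of his Arithmetica Infinitorum, not a quotation from the book under review) is his infinite product for π:

\[
\frac{\pi}{2} = \prod_{n=1}^{\infty} \frac{4n^{2}}{4n^{2}-1} = \frac{2}{1}\cdot\frac{2}{3}\cdot\frac{4}{3}\cdot\frac{4}{5}\cdot\frac{6}{5}\cdot\frac{6}{7}\cdots,
\]

obtained by interpolating infinite sequences of ratios rather than by classical geometric construction – precisely the style of reasoning Hobbes rejected.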

The book is excellently written and its mathematical concepts are clearly explained, making it fully accessible to a general audience. With his fascinating narrative, the author intrigues the reader, depicting the historical background and, in particular, recounting the plots of the Holy See, the Jesuits' fight for power, the Reformation, the absolutist power of the kings, and the early steps of Europeans towards democracy and freedom of thought. The book includes extensive endnotes, a useful index of concepts, a timeline and a “dramatis personae” section divided between “infinitesimalists” and “non-infinitesimalists”. Finally, the images and portraits included in the book enhance the reader's enjoyment.

Sri Lanka signs International Co-operation Agreement with CERN

On 8 February, Sri Lanka and CERN entered into an International Co-operation Agreement (ICA) concerning scientific and technical co-operation in high-energy physics. The agreement was signed by Susil Premajayantha, Sri Lankaʼs hon. minister of science, technology and research, and Charlotte Warakaulle, CERNʼs director for international relations.

The new agreement follows an Expression of Interest signed in June 2015 by Sri Lankan ambassador Ravinatha Aryasinha and then Director-General Rolf-Dieter Heuer, which already incorporated Sri Lanka into CERN’s high-school teacher and summer-student programmes. Previously, scientists from Sri Lankan universities had participated in LHC experiments within the framework of sabbatical leave or similar programmes, while others took part as visiting scientists employed by universities in a third country.

With the partnership now formalised via the new ICA, students, scientists, engineers and research institutes in Sri Lanka will be able to benefit from broader and more sustained participation in CERN, and thus be exposed to cutting-edge technology and research in high-energy physics.

“ICAs help to strengthen the global network for particle physics, which is essential for the future of the discipline and for fundamental research more generally,” says Warakaulle. “It is significant to see that a smaller, developing country is emphasising fundamental research and making the connection with CERN a priority. It testifies to an understanding of the value of fundamental research, which is commendable in a country that is facing other challenges. It also further enhances the CERN connection with South Asia, following the associate memberships of Pakistan and India.”

European computing cloud takes off

A European scheme to make publicly funded scientific data openly available has entered its first phase of development, with CERN one of several organisations poised to test the new technology. Launched in January and led by the UK’s Science and Technology Facilities Council, a €10 million two-year pilot project funded by the European Commission marks the first step towards the ambitious European Open Science Cloud (EOSC) project. With more than 30 organisations involved, the aim of the EOSC is to establish a Europe-wide data environment to allow scientists across the continent to exchange and analyse data. As well as providing the basis for better scientific research and making more efficient use of data resources, the open-data ethos promises to address societal challenges such as public-health or environmental emergencies, where easy access to reliable research data may improve response times.

The pilot phase of the EOSC aims to establish a governance framework and to build the trust and skills required. Specifically, the pilot will encourage selected communities to develop demonstrators that showcase EOSC’s potential across various research areas, including the life sciences, energy, climate science, materials science and the humanities. Given the intense computing requirements of high-energy physics, CERN is playing an important role in the pilot project.

The CERN demonstrator aims to show that the basic requirements for the capture and long-term preservation of particle-physics data, documentation, software and the environment in which it runs can be satisfied by the EOSC pilot. “The purpose of CERN’s involvement in the pilot is not to demonstrate that the EOSC can handle the complex and demanding requirements of LHC data-taking, reconstruction, distribution, re-processing and analysis,” explains Jamie Shiers of CERN’s IT department. “The motivation for long-term data preservation is for reuse and sharing.”

Propelled by the growing IT needs of the LHC and by experience gained in deploying scientific workloads on commercial cloud services, CERN proposed a model for a European science cloud some years ago, explains Bob Jones of CERN’s IT department. In 2015 this model was expanded and endorsed by the members of EIROforum. “The rapid expansion in the quantities of open data being produced by science is stretching the underlying IT services,” says Jones. “The Helix Nebula Science Cloud, led by CERN, is already working with leading commercial cloud-service providers to support this growing need for a wide range of scientific use cases.”

The challenging EOSC project, which raises issues such as service integration, intellectual property, legal responsibility and service quality, complements the work of the Research Data Alliance and builds on the European Strategy Forum on Research Infrastructure (ESFRI) road map. “Our goal is to make science more efficient and productive and let millions of researchers share and analyse research data in a trusted environment across technologies, disciplines and borders,” says Carlos Moedas, EC commissioner for research, science and innovation.

Milestone for US dark-matter detector

The US Department of Energy (DOE) has formally approved a key construction milestone for the LUX-ZEPLIN (LZ) experiment, propelling the project towards its April 2020 goal for completion. On 9 February the project passed a DOE review and approval stage known as “Critical Decision 3”, which accepts the final design and formally launches construction. The LZ detector, which will be built roughly 1.5 km underground at the Sanford Underground Research Facility in South Dakota and be filled with 10 tonnes of liquid xenon to detect dark-matter interactions, is considered one of the best bets to determine whether dark-matter candidates known as WIMPs exist.

The project stems from the merger of two previous experiments: LUX (Large Underground Xenon) and ZEPLIN (ZonEd Proportional scintillation in LIquid Noble gases). It was first approved in 2014 and currently involves about 250 scientists at 37 institutions in the US, UK, Portugal, Russia and Korea. The detector is expected to be at least 50 times more sensitive to dark-matter signals than its predecessor LUX, and will compete with other liquid-xenon experiments under development worldwide in the race to detect dark matter. A planned upgrade of the current XENON1T experiment (called XENONnT) at the Gran Sasso National Laboratory in Italy and China’s plans to advance the PandaX-II detector, for instance, are both expected to have a similar schedule and scale to LZ.

The LZ collaboration plans to release a Technical Design Report later this year. “We will try to go as fast as we can to have everything completed by April 2020,” says LZ project director Murdock Gilchriese. “We got a very strong endorsement to go fast and to be first.”

European organisations uphold scientific values

More than 50 science organisations in Europe have written an open letter expressing concern about the impact of recent US policies on science, research and innovation. The 10 February letter, which was organised by EuroScience (founder of the EuroScience Open Forum, ESOF), asks that the principles and values that underpin scientific progress be upheld. It is addressed to the presidents of the European Council and European Commission, and to prime ministers and science ministers in individual European countries.

The European Physical Society (EPS) is among the many signatories of the letter, as are the Marie Curie Alumni Association, the Royal Society and the Royal Swedish Academy of Sciences. Explaining the decision to sign, outgoing EPS president Christophe Rossel says: “Science never was, and never will be, restrained by physical, cultural and political barriers. In our globalised world, where international scientific collaboration has become the rule, there is no place for discrimination and censorship. Any measure that restricts the freedom of movement and communication of our US colleagues will have a profound impact on science and innovation in Europe and other continents.”

Three chief concerns are outlined in the letter: the recent Executive Order discriminating against persons on the basis of their nationality; indications that US government scientists might be affected by new policies that limit their communication with the press; and the unwarranted credibility given to views that are not based on facts and sound evidence in areas such as climate science. It states that all of these are at odds with the principles of transparency, open communication and the mobility of scholars and scientists, “which are vital to scientific progress and to the benefit of our societies, economies and cultures deriving from it”.

Chamonix event prepares for LHC’s future

2016 was a remarkably successful year for CERN’s Large Hadron Collider (LHC), marked by excellent peak performance, good availability and operational flexibility (CERN Courier December 2016 p5). With further improvement as the target, a thorough review of LHC operation and system performance was the focus of discussions in the first phase of the annual LHC performance workshop, which took place from 23 to 26 January in Chamonix, France.

Experts from the accelerator sector, CERN management and members of the CERN Machine Advisory Committee explored the operational scenarios for the remainder of Run 2 and made preliminary decisions regarding optics and machine parameters. Beam is due back in the LHC at the beginning of May, and the rest of the year will be dedicated essentially to proton–proton physics, with the usual mix of machine development and special physics runs. By quantifying the limitations on peak luminosity from electron-cloud effects, the cryogenics system and other factors, the workshop also drew up luminosity estimates for the coming years: in 2017 the peak luminosity should reach at least 1.7 × 10³⁴ cm⁻² s⁻¹, and the integrated-luminosity target for ATLAS and CMS is 45 fb⁻¹.
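
For scale (a back-of-envelope conversion, not a figure quoted at the workshop): since 1 fb⁻¹ = 10³⁹ cm⁻², the integrated-luminosity target corresponds to

\[
\frac{45~\mathrm{fb^{-1}}}{1.7\times 10^{34}~\mathrm{cm^{-2}\,s^{-1}}} = \frac{4.5\times 10^{40}~\mathrm{cm^{-2}}}{1.7\times 10^{34}~\mathrm{cm^{-2}\,s^{-1}}} \approx 2.6\times 10^{6}~\mathrm{s} \approx 31~\mathrm{days}
\]

of uninterrupted running at peak luminosity – a reminder of how much luminosity decay, turnaround and downtime must be absorbed within a roughly six-month proton run.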

One open question about future LHC operation concerns increasing the beam energy from 6.5 to 7 TeV per beam, which would see the machine reach its design specification. To gain input on high-field magnet behaviour, a dipole-training campaign was conducted at the start of the year-end technical stop (CERN Courier March 2017 p9). Experience from this and previous training campaigns was reviewed, and the duration, timing and associated risks of pushing up to 7 TeV – including the implications for other accelerator systems, such as the LHC beam dump – were explored. There will be no change of beam energy in 2017 or 2018; the goal is to prepare the LHC to run at 14 TeV during Run 3, with the experiments expressing a clear preference for making the change in energy in a single step.

Regarding the longer-term future of the LHC, the High-Luminosity LHC (HL-LHC) demands challenging proton and ion beam parameters from the injector complex. The LHC injector upgrade (LIU) project is charged with planning and executing wide-ranging upgrades to the complex to meet these requirements. Both the LIU and HL-LHC projects have come through a recent cost-and-schedule review, and at present are fully funded and on schedule. The injector upgrades will be deployed during Long Shutdown 2 (LS2) in 2019/2020, while the HL-LHC will see the major part of its upgrades implemented in LS3, which is due to start in 2024.

With only two more years of operation before the next long shutdown, planning for LS2 is already well advanced. For the LHC itself, LS2 will not require the same level of intervention as LS1. Nonetheless, there is still a substantial amount of work planned across the complex, including major upgrades to the injectors in the framework of LIU and significant upgrades to the LHC experiments.

The exploitation of the LHC and the injector complex has been impressive recently, but work across the Organization continues unabated in the push to get the best out of the LHC in both the medium and long term.

CMS undergoes tracker transplant

At the beginning of March, the CMS collaboration successfully replaced the heart of its detector: the pixel tracker. This innermost layer of the CMS detector – a cylindrical device whose silicon sensors comprise 124 million pixels that record the trajectories of charged particles – is the first to be encountered by debris from the LHC’s collisions.

The original three-layer 64 Mpix tracker, in place since the LHC started operation in 2008, was designed for a lower collision rate than the LHC will deliver in the coming years. Its replacement contains an additional layer and has its innermost layer placed closer to the interaction point. This will enable CMS to cope with the harsher collision environment of future LHC runs, in which the detector has to handle the products of a large number of simultaneous collisions. The new pixel detector will also be better at pinpointing where individual collisions occurred, and will therefore enhance the precision with which predictions of the Standard Model can be tested.

After a week of intense activity, and a few frayed nerves, the new subdetector was safely in place by 8 March. Once testing is complete, CMS will be closed, ready for the LHC to return to action in May.
