This book is a collection of essays on various physics topics, which the author aims to present in a manner accessible to non-experts and, specifically, to undergraduate science and arts students outside physics. The author is motivated by the conviction that understanding the fundamental concepts of other subjects facilitates out-of-the-box thinking, which can lead to original contributions to one’s chosen field.
The selection of topics is very personal: some basic-physics concepts, such as standards for units and oscillation theory, are placed next to discussions of general relativity and the famous twin paradox. The author writes in an informal style and has a particular interest in dispelling some myths about science.
The final chapters cover topics from his area of research, atomic and optical physics, focusing on the Nobel Prizes awarded in the last two decades to scientists in these fields.
Even though the use of equations is kept to a minimum, some mathematics and physics background is required of the reader.
Raw Data is a scientific novel that explores the moral dilemmas surrounding the accidental discovery of a case of scientific misconduct within a top US biomedical institute.
The choice of subject is interesting and unusual. Scientific misconduct is not an unprecedented topic for scientific novels, but the focus is usually on spectacular frauds that clearly violate the ethos of the scientific community. This story depicts a more nuanced situation. Readers may even find themselves understanding, if not condoning, the conscious decision of one of the co-protagonists to cheat.
This character chooses to “cut a corner” out of fear of being scooped, to satisfy an unreasonably picky reviewer who had requested an additional control experiment that she deems irrelevant. The stakes for her career are huge because she is competing with other groups on the same research line, and publishing second would cost her a great deal academically. When a co-worker accidentally finds hints of her fabrication and immediately alerts the laboratory’s principal investigator, both find themselves in a bitter no-win situation. “Doing the right thing” has a significant cost, but any other option potentially entails far worse consequences for their careers and their reputations.
Along the way, the author illustrates vividly how people in research think, feel, work and live. Work–life balance in science, especially for young female researchers, is a secondary theme of the book. Overall, the portrait of academia is not a flattering one, but it is a faithful one. As someone who works in high-energy physics, I learnt about day-to-day practices in the biomedical sector and how they differ from those in my own field. Although the author focuses on her own area of the scientific environment, some descriptions of “postdoc life” are quite general.
This relatively short novel is followed by a long Q&A section with the author, a former biomedical researcher who left the field after considerable career achievements. There she makes explicit her opinions on several topics, including the “publish or perish” attitude, work–life balance, scientific integrity, and what she perceives as systemic dangers for the academic research world.
Although the author clearly made an effort to simplify the science to the minimum needed to understand the plot (and as a reader with no understanding of microbiology I found her effort successful), I am not sure that a reader with no previous interest in science would be hooked by the story. The book is well written, but the plot has a slow pace and, while Springer deserves credit for publishing it, the text contains many typographical errors.
Overall, I recommend the book to other scientists, regardless of their specialisation, and to the scientifically educated public who may appreciate this insider view of contemporary research life.
One of the most intriguing works in the philosophy of science is Wigner’s 1960 paper titled “The Unreasonable Effectiveness of Mathematics in the Natural Sciences”. Indeed the fact that so many natural laws can be formulated in this language, not to mention that some of these represent the most precise knowledge we have about our world, is a stunning mystery.
A related question is whether mathematics, which has largely developed overlapping or in parallel with physics, is constructed by the human mind or “discovered”. This question is worth asking again today, when modern theories of fundamental physics and contemporary mathematics have reached levels of abstraction that are unimaginable from the perspective of just 100 years ago.
This book is a collection of essays discussing the connection between physics and mathematics. They are written by the winners of the 2015 Foundational Questions Institute contest, which invited contributors – from professional researchers to members of the public – to propose an essay on the topic.
Since the question belongs primarily to the philosophy of science rather than to science itself, it is no surprise that the essays present conflicting viewpoints that sometimes reach opposite conclusions.
A significant point of view is that the claimed effectiveness of mathematics is actually not that surprising. This is because we process information and generate knowledge about our world in an inadvertently biased way, namely as a result of the evolution of our mind in a specific physical world. For example, concepts of elementary geometry (such as straight lines, parabolas, etc) and the mechanics of classical physics are deeply imprinted in the human brain as evolutionary bias. In a fuzzy, chaotic world, such naive mathematical notions might not have developed, as they wouldn’t represent a good approximation to that world. In fact, in a drastically unstructured world it would have been less likely that life had evolved in the first place, so it may not seem such a surprise that we find ourselves in a world largely governed by relatively simple geometrical structures.
What remains miraculous, on the other hand, is the effectiveness of mathematics in the microscopic realm of quantum mechanics: it is not obvious how the mathematical notions on which it is based could be explained in terms of evolutionary bias. Actually, much of the progress of fundamental physics during the last 100 years or so crucially depended on abandoning the intuition of everyday common sense, in favour of abstract mathematical principles.
Another aspect is selection bias, in that failures of the mathematical description of certain phenomena tend simply to be ignored. A prime example is human consciousness – undoubtedly a real-world phenomenon – for which it is not at all clear whether its structure can ever be mapped to mathematical concepts in a meaningful way. A quite common reductionist point of view typical of particle physicists is that, since the brain is essentially chemistry (thus physics), a mathematical underpinning is automatic. But it may be that the way such complex phenomena emerge completely obfuscates the connection to the underlying, mathematically clean microscopic physics, rendering the latter useless for any practical purpose in this regard.
This raises the issue of the structure of knowledge per se, and some essays in this book argue that it may not necessarily be hierarchical but rather scale invariant with some, or many, distinguished nodes. One may think of these as local attractors to which “arrows of deeper explanation” point. It may be that only locally near such attractors does knowledge appear hierarchical, so that, for example, our mathematical description of fundamental physics is meaningful only near one particular such node. There might be other local attractors that are decoupled from our mathematical modelling, with no obvious chains of explanation linking them.
On a different tack, a radically different and extreme point of view is taken by adherents of Tegmark’s mathematical universe hypothesis, which several of the authors address directly. This posits that there is actually no difference between mathematics and the physical world, so the role of mathematics in our physical world appears as a tautology.
Surveying all the thoughts in this collection of essays would be beyond the scope of this review. Suffice it to say that the book should be of great interest to anybody pondering the meaning of physical theories, although it appears more useful for scientists rather than for the general public. It is not an easy read, but the reader is rewarded with a great deal of food for thought.
The Large Hadron Collider Physics (LHCP) conference took place at Shanghai Jiao Tong University (SJTU) in China, on 15–20 May. One of the largest annual conferences in particle physics, LHCP2017 coincided with fresh experimental results from the ALICE, ATLAS, CMS and LHCb experiments based on 13 TeV LHC data recorded during 2015–2016. The conference saw many new results presented and also offered a broad overview of the scientific findings from Run 1, based on lower-energy data.
One of the main themes of the conference was the interplay between different results from various experiments, in particular those at the LHC, and the need to continue to work closely with the theory community. One such example concerns measurements of rare B-meson decays and in particular the decay B0→ K*l+l–, which is sensitive to new physics and could probe the presence of new particles through the study of the B0 helicity structure. The LHCb collaboration has found several discrepancies with Standard Model (SM) expectations, including a more than three standard-deviation discrepancy in the angular distributions of this B0 decay. New results presented by ATLAS and CMS have created further tension in the situation (see diagram), and more data from LHC Run 2 and continued theoretical developments will be critical in understanding these decays.
An exciting result from the ALICE experiment showed a surprising enhancement of strange-baryon production in proton–proton collisions (CERN Courier June 2017 p10). In nucleus–nucleus collisions, this enhancement is interpreted as a signature of the formation of a quark–gluon plasma (QGP) – the extreme state that characterised the early universe before the appearance of hadrons. The first observation of strangeness enhancement in high-multiplicity proton–proton collisions hints that the QGP is also formed in collisions of smaller systems and opens new directions for the study of this primordial state of matter.
From the Higgs sector, CMS reported an observation of Higgs decays to two τ leptons with a significance of 4.9 standard deviations compared to SM backgrounds. Differential cross-sections for Higgs decays to two Z bosons, which test properties of the Higgs such as its spin and parity and also act as a probe of perturbative QCD, were shown by ATLAS. Throughout the conference, it was clear that precision studies of the Higgs sector are a critical element in elucidating the nature of the Higgs boson itself, as well as understanding electroweak symmetry breaking and searching for physics beyond the SM.
In addition to these highlights, a broad spectrum of results was presented. These ranged from precision studies of the SM, such as new theoretical developments in electroweak production, to numerous new search results, such as searches for low-mass dark-sector mediators from the CMS experiment and searches for supersymmetry in very high-multiplicity jet events from ATLAS. The conclusion from the conference was clear: we have learnt a tremendous amount from the Run 2 LHC data but are left with many open questions. We therefore eagerly await the newest data from the LHC to help further dissect the SM, cast light on the nature of the Higgs, or find an entirely new particle.
The 8th International Particle Accelerator Conference (IPAC) took place in Copenhagen, Denmark, on 14–19 May and was attended by more than 1550 participants from 34 countries. Hosted by the European Spallation Source (ESS) and organised under the auspices of the European Physical Society (EPS) accelerator group and the International Union of Pure and Applied Physics, the event was also supported by the MAX-IV facility and Aarhus University.
Although accelerators were initially developed to understand the infinitesimal constituents of matter, they have evolved into sophisticated instruments for a wide range of fundamental and applied research. Today, particle accelerators serve society in numerous ways, ranging from medicine and energy to the arts and security. Advanced light sources are a case in point, following the steady improvement in their performance in terms of brilliance and temporal characteristics. MAX-IV and the ESS, which lie just across the Oresund bridge in Sweden, are two of the most powerful instruments available to life and materials scientists; the former is operating, while the latter is under construction. Meanwhile, the most brilliant source of ultra-short flashes of X-rays – the European X-ray Free Electron Laser at DESY in Hamburg – has recently achieved first lasing and will soon be open to users (“Europe enters the extreme X-ray era”). Another X-ray free-electron laser, the SwissFEL at PSI, has just produced laser radiation for the first time in the soft X-ray regime and aims to achieve shorter wavelengths by the end of the year. New synchrotron light sources have also come into operation, such as the SOLARIS synchrotron in Poland, and major upgrades to the European Synchrotron Radiation Facility in France based on a new lattice concept are planned.
Particle physics remains one of the main drivers for new accelerator projects and for R&D in IPAC’s many fields. The big brother of all accelerators, CERN’s LHC, performed outstandingly well during 2016, exceeding nominal luminosity by almost 50% thanks to operation with more tightly spaced bunches and the higher brightness of the beams delivered by the LHC injectors. Mastering the effects of electron clouds and carrying out progressive “scrubbing” of the surfaces of the LHC beam screens have been key to this performance. Achieving nominal luminosity marks the completion of one of the most ambitious projects in science and bodes well for the High Luminosity LHC upgrade programme now under way. IPAC17 also heard the latest from experiments at CERN’s Antiproton Decelerator facility, including the trapping and subsequent spectroscopic measurements of antihydrogen atoms and the exciting studies that will be carried out using the new ELENA facility there.
One of the pioneers of silicon radiation detectors, Gerhard Lutz, passed away in Vienna on 28 April. He will be remembered for numerous inventions that shaped the field of silicon detectors, his deep insight into detector physics and analysis methods, his role as mentor of many young scientists, and his modest and charming personality.
Gerhard Lutz was born in Klagenfurt, Austria, in 1939. He studied physics at the Technical University of Vienna and obtained his PhD from the University of Hamburg under Willibald Jentschke, the founder of DESY and later a Director-General of CERN. His thesis concerned coherent bremsstrahlung and pair production in diamond crystals using the DESY synchrotron, and demonstrated the production of GeV photons with a polarisation in excess of 70%. In 1967 he moved to Northeastern University in Boston and contributed to a spectrometer experiment at Brookhaven, which aimed to follow up spectacular results reported earlier by the “CERN missing mass spectrometer”: the splitting of the A2 resonance and the observation of narrow high-mass resonances. Based on high-quality data and a painstaking analysis, he showed that the CERN results were incorrect.
In 1972, Lutz took a position at MPI-Munich and initiated a precision measurement of the reaction π– p(↑) → π– π+ n. He organised and ran the experiment, wrote the event-reconstruction software and developed the complex mathematical formalism necessary to interpret the results – a milestone in the understanding of exclusive hadronic reactions. In the late 1970s the CERN–Munich group expanded into the ACCMOR collaboration, which pioneered the use of high-precision silicon tracking detectors. Together with Josef Kemmer and Robert Klanner, he developed silicon microstrip detectors using planar technology and built the vertex telescope for the CERN fixed-target experiments NA11 and NA32. The precision achieved by this device (5 μm), together with its ability to operate reliably in a high-intensity beam and to identify charm particles against a huge background of hadronic events, launched the success story of silicon detectors. Today, practically all high-energy physics experiments rely on this technology.
Lutz’s contributions in the field of silicon detectors are numerous: the understanding of detector instabilities due to surface effects; the development of double-sided silicon-strip detectors; the concept of fully depleted pnCCDs based on the principle of sideward depletion; the realisation of novel concepts for silicon sensors with intrinsic gain; and the invention of the DePFET detector-amplifier structure. His developments found their way into many experiments outside particle physics, in particular in astrophysics and X-ray science, and also into industry. Lutz co-founded the Max Planck semiconductor laboratory (Halbleiterlabor, HLL) in 1992, the research company PNSensor in 2002, and the instrumentation company PNDetector in 2007. Until the very end he contributed to the success of both companies with his sharp mind and inventions, while his guidance, inspiration and ideas have been essential for the success of semiconductor developers in the Munich area.
Those who had the opportunity to work with Gerhard Lutz appreciated his gentle and quiet way, his competence and deep insight. His scientific standards were very high and he detested superficial statements. His unconventional and original ideas inspired many colleagues and students, and his book Semiconductor Radiation Detectors has become a classic in the field. Gerhard Lutz’s innovative and influential work was honoured by the 1966 Röntgen Award, the 2011 Radiation Instrumentation Outstanding Achievement Award, and the 2017 High Energy Physics Prize of the European Physical Society (see “EPS awards prizes for high-energy physics”).
Cécile DeWitt-Morette, founder of the Les Houches summer school, passed away on 8 May at the age of 94. Born in Caen, she studied in Paris after completing her bachelor’s degree. In 1944 her mother, sister and grandmother were killed in the Allied bombing of Caen, but in Paris she secured a job at CNRS and was awarded a PhD three years later with a thesis on meson production. She was then invited to the Institute for Advanced Study in Princeton by Robert Oppenheimer, where she met her future husband, the US physicist Bryce DeWitt (they would go on to have four daughters).
Mixing with the best of US physics made her realise the poor situation of the field in France, especially particle physics, and drove her to do something about it. Precisely at that time, a summer school was organised at the University of Michigan in Ann Arbor, and Cécile had the idea to create such an event in France. Her beautiful eyes with double-iris rings and considerable powers of persuasion, not to mention a fantastic intuition for selecting the best possible lecturers, were difficult to resist. She had a friend whose father, the architect Albert Laprade, loaned her a piece of land at La Côte des Chavants, just above the village of Les Houches in the Arve valley, among farms and cottages. Financial support soon followed thanks to her skilful negotiating tactics, and in the summer of 1951 I was one of a few candidates to attend the school for a period of three months. She had chosen fantastic professors: Léon Van Hove for quantum mechanics and Viki Weisskopf for nuclear physics, both of whom would become Directors-General of CERN; Res Jost for field theory; Walter Kohn (a future Nobel Prize winner) for solid-state physics; plus seminars by giants such as Wolfgang Pauli. We worked very hard, except for some excursions in the mountains, and learnt a lot.
The Les Houches school, of which Cécile remained director for 22 years, continued to be a complete success. Many of its students and some of its teachers received the Nobel Prize, the Wolf Prize or the Fields Medal; among them were Pierre-Gilles de Gennes and Claude Cohen-Tannoudji. The demand for basic courses dissipated over the years, but the school became a place for high-level specialised topics, and continues to be so.
Cécile also played an important role in founding the Institut des Hautes Études Scientifiques (IHES) in Bures-sur-Yvette, and did important work on functional integration, also collaborating with her mathematical-physicist husband. They were professors at the University of North Carolina at Chapel Hill and, later, at the University of Texas at Austin. Bryce died in 2004, just as he was about to receive the Einstein Prize from the American Physical Society.
I met Cécile for the last time at IHES in 2011, where she was made Officier de la Légion d’honneur. On 7 May, the day before she died, I understand that she was delighted to learn that the anti-European candidate for president of France, Marine Le Pen, had been defeated.
On 27 June, representatives of CERN and the Republic of Lithuania signed an agreement in the capital city of Vilnius admitting Lithuania as an associate Member State. The agreement will enter into force once official approval is received from the Lithuanian government.
“The involvement of Lithuanian scientists at CERN has been growing steadily over the past decade, and associate membership can now serve as a catalyst to further strengthen particle physics and fundamental research in the country,” said CERNʼs Director-General, Fabiola Gianotti. “We warmly welcome Lithuania to the CERN family, and look forward to enhancing our partnership in science, technology development and education and training.”
Lithuania’s relationship with CERN dates back to a co-operation agreement signed in 2004, which paved the way for the participation of Lithuanian universities and scientific institutions in high-energy physics experiments at CERN. Lithuania has been a long-time contributor to the CMS experiment and has played an important role in developing its databases. The country actively promoted the BalticGrid in 2005, and more generally participates in detector development relevant to the High Luminosity LHC.
Lithuania’s associate membership will strengthen the long-term partnership between CERN and the Lithuanian scientific community. It will allow Lithuania to take part in meetings of CERN Council and its committees, and Lithuanian scientists will be eligible for staff appointments. Finally, once the agreement enters into force, Lithuanian industry will be entitled to bid for CERN contracts.
First stable beams in the LHC were declared on 23 May, just 25 days after the first beam was injected and almost three weeks ahead of schedule. Since then, interleaved with physics operation and remaining commissioning activities, the LHC teams have been busy ramping up the intensity of the beams. During this procedure, the number of proton bunches circulating in the machine is increased in a stepwise manner: beginning with three bunches per beam and going up to 12, 72, 300, 600, 900, 1200, 1800, 2400 and finally 2556 bunches per beam. To ensure that all systems work as they should, each step requires a minimum of 20 hours of stable-beam operation and that the machine is filled three times. As the Courier went to press on 28 June, 2556 bunches were circulating in the machine and the experiments had already clocked an integrated luminosity of around 5 fb–1.
Another important procedure during the LHC restart is the so-called scrubbing run to condition the vacuum chamber, which took place in early June. Despite the ultra-high vacuum of the LHC beam pipe, residual gas molecules and electrons remain trapped on the walls of the chamber and can be liberated by the circulating beam, eventually heating the walls and destabilising the beam. Such “electron-cloud” effects can be reduced by repeatedly filling the LHC with closely spaced bunches, provoking intense electron clouds that gradually become less prone to produce further electrons.
The rapid and smooth restart of the LHC this year, which marks the continuation of Run 2 at a centre-of-mass energy of 13 TeV, is due to the excellent availability of the machine and its injector chain, and also the dedication of many specialists. The LHC is now ready to continue the intensity ramp for physics-data collection, with the ambitious goal of reaching an integrated luminosity of 45 fb–1 for 2017.
The LHCb collaboration has discovered a new weakly decaying particle: a baryon called the Ξ++cc, which contains two charm quarks and an up quark. The discovery of the new particle, which was observed decaying to the final state Λ+c K–π+π+ and is predicted by the Standard Model, was presented at the European Physical Society conference in Venice on 6 July.
Although the quark model of hadrons predicts the existence of doubly heavy baryons – three-quark states that contain two heavy (c or b) quarks – this is the first time that such states have been observed unambiguously with overwhelming statistical significance (well in excess of 5σ with respect to background expectations). The properties of the newly discovered Ξ++cc baryon shed light on a long-standing puzzle surrounding the experimental status of doubly charmed baryons, opening an exciting new branch of investigation for LHCb.
The team scrutinised large high-purity samples of Λ+c→ p K–π+ decays in LHC data recorded at 8 and 13 TeV in 2012 and 2016, respectively, and discovered an isolated narrow structure in the Λ+c K–π+π+ mass spectrum (associating the Λ+c baryon with further particles) at a mass of around 3620 MeV/c2. After ruling out all known potential spurious sources, the collaboration concluded that the highly significant peak is a previously unobserved state. Corroboration that it is the weakly decaying Ξ++cc came from examining a subset of data in which the reconstructed baryons lived for a measurable period before decaying. Such a requirement eliminates all promptly decaying particles, leaving only the long-lived ones that are the hallmark of weak transitions.
Although the existence of baryons with valence-quark content ccu and ccd (corresponding to the Ξ++cc and its isospin partner Ξ+cc) is expected, the experimental status of these states has been controversial. In 2002, the SELEX collaboration at Fermilab in the US claimed the first observation of this class of particle by observing a significant peak of about 16 events at a mass of 3519±1 MeV/c2 in the Λ+c K–π+ mass spectrum, which they identified as the closely related state Ξ+cc. Puzzlingly, the short lifetime (which was too small to be measured at SELEX) and the very large production rate of the state seemed not to match theoretical expectations for the Ξ+cc. Despite SELEXʼs confirmation of the observation in a second decay mode, all subsequent searches – including efforts at the FOCUS, BaBar and Belle experiments – failed to find evidence for doubly charmed baryons. That left both theorists and experimentalists awaiting a firm observation by a more powerful heavy-flavour detector such as LHCb. Although the new result from LHCb does not fully resolve the puzzle (with a mass difference of 103±2 MeV/c2, LHCbʼs Ξ++cc and SELEXʼs Ξ+cc seem irreconcilable as isospin partners), the discovery is a crucial step to an empirical understanding of the nature of doubly heavy baryons.