This book provides an introduction to the theory of quantum transport at the nanoscale – a rapidly developing field that studies charge, spin and heat transport in nanostructures and nanostructured materials. The theoretical models and methods collected in the volume are widely used in nano-, molecular- and bio-electronics, as well as in spin-dependent electronics (spintronics).
The book begins by introducing the basic concepts of quantum transport, including the Landauer–Büttiker method; the matrix Green function formalism for coherent transport; tunnelling (transfer) Hamiltonian and master equation methods for tunnelling; Coulomb blockade; and vibrons and polarons.
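For orientation, the central result of the Landauer–Büttiker approach – standard in the field, and quoted here for the reader's convenience rather than from the book itself – expresses the conductance of a phase-coherent conductor in terms of the transmission probabilities of its conduction channels:

\[ G = \frac{2e^2}{h} \sum_n T_n , \]

where \(T_n\) is the transmission probability of channel \(n\) and \(2e^2/h\) is the conductance quantum, the factor of two accounting for spin.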
In the second part of the book, the author gives a general introduction to the non-equilibrium Green function theory, describing first the approach based on the equation-of-motion technique, and then a more sophisticated one based on the Dyson–Keldysh diagrammatic technique. The book focuses in particular on the theoretical methods able to describe the non-equilibrium (at finite voltage) electron transport through interacting nanosystems, specifically the correlation effects due to electron–electron and electron–vibron interactions.
The book would be useful both for master's and PhD students and for researchers and professionals already working in the field of quantum transport theory and nanoscience.
This monograph presents a comparative analysis of different thermodynamic models of the equation of state (EOS). The author aims to present in a unified way both the theoretical methods and experimental material relating to the field.
Particular attention is given to the description of extreme states reached at high pressure and temperature. As a substance advances along the scale of pressure and temperature, its composition, structure and properties undergo radical changes, from the ideal state of non-interacting neutral particles described by classical Boltzmann statistics to the exotic forms of baryon and quark–gluon matter.
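The lowest rung of that ladder is the familiar ideal-gas equation of state – a textbook result, quoted here for orientation:

\[ P = n k_B T , \]

where \(n\) is the particle number density. Each successive regime treated in the monograph can be read as a progressive breakdown of the assumptions of non-interacting, non-relativistic, non-degenerate particles behind this simple formula.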
Studying the EOS of matter under extreme conditions is important for the study of astrophysical objects at different stages of their evolution, as well as in plasma, condensed-matter and nuclear physics. It is also of great interest for the physics of high energy densities that have either already been attained or could be reached in the near future under controlled terrestrial conditions.
Ultra-extreme astrophysical and nuclear-physical applications are also analysed. Here, the thermodynamics of matter is affected substantially by relativity, intense gravitational and magnetic fields, thermal radiation, the transformation of nuclear particles, nucleon neutronisation, and quark deconfinement.
The book is intended for a wide range of specialists who study the EOS of matter and high-energy-density physics, as well as for senior students and postgraduates.
Nowadays, enormous quantities of data in a variety of forms are generated rapidly in fields ranging from social networks to online shopping portals to physics laboratories. The field of “big data” involves all the tools and techniques that can store and analyse such data, whose volume, variety and speed of production are not manageable using traditional methods. As such, this new field requires us to face new challenges. These challenges and their possible solutions are the subject of this book of 17 chapters, which is clearly divided into two sections: data management and security.
Each chapter, written by different authors, describes the state of the art for a specific issue that the reader may face when implementing a big-data solution. Far from being a step-by-step manual, the book treats each topic theoretically and describes its practical uses. Every subject is very well referenced, pointing to many publications for readers to explore in more depth.
Given the diversity of topics addressed, it is difficult to give a detailed opinion on each of them, but some deserve particular mention. One is the comparison between different communication protocols, presented in depth and accompanied by many graphs that help the reader to understand the behaviour of these protocols under different circumstances. However, the black-and-white print makes it difficult to differentiate between the lines in these graphs. Another topic that is nicely introduced is the SP (simplicity and power) system, which applies innovative solutions to challenges such as the variety of data when dealing with huge volumes. Even though the majority of the topics in the book are clearly linked to big data, some of them are related to broader computing topics such as deep-web crawling or malware detection in Android environments.
Security in big-data environments is widely covered in the second section of the book, spanning cryptography, accountability and cloud computing. As the authors point out, privacy and security are key: solutions are proposed to successfully implement a reliable, safe and private platform. When managing such amounts of data, privacy needs to be treated carefully, since sensitive information could be extracted. The topic is addressed in several chapters from different points of view, from looking at outsourced data to accountability and integrity. Special attention is also given to cloud environments, since they are not as controlled as those “in house”. Cloud environments may require data to be securely transmitted, stored and analysed to avoid access by unauthorised sources. Proposed approaches to apply security include encryption, authorisation and authentication methods.
The book is a good introduction to many of the aspects that readers might face or want to improve in their big-data environment.
This mighty 840-page book covers an impressive range of subjects divided into no fewer than 45 chapters. Owing to the expertise and international reputations of the authors of the individual chapters, few if any other books in this field have managed to summarise such a broad topic with such authority. The authors – a veritable “who’s who” of the accelerator world – are too numerous to list in the space provided, but the full list can be viewed at worldscientific.com/worldscibooks/10.1142/8635#t=toc.
The book opens with two chapters devoted to a captivating historical review of the Standard Model and a general introduction to accelerators, and closes with two special sections. The first of these is devoted to novel accelerator ideas: plasma accelerators, energy-recovery linacs, fixed-field alternating-gradient accelerators, and muon colliders. The second describes European synchrotrons used for tumour therapy with carbon ions, covering in particular the Heidelberg Ion Therapy Centre designed by GSI and the CERN Proton Ion Medical Machine Study; its final chapter describes the transformation of the CERN LEIR synchrotron into an ion facility for radiobiological studies.
Concerning the main body of the book, 17 chapters look back over the past 100 years, beginning with a concise history of the first three lepton colliders: AdA in Frascati, VEP-1 in Novosibirsk and the Princeton–Stanford electron–electron collider. A leap in time then takes the reader to CERN’s Large Electron–Positron collider (LEP), which is followed by a description of the Stanford Linear Collider. Unfortunately, this latter chapter is too short to do full justice to such an innovative approach to electron–positron collisions.
The next section is devoted to beginnings, starting from the time of the Brookhaven Cosmotron and Berkeley Bevatron. The origin of alternating-gradient synchrotrons is well covered through a description of the Brookhaven AGS and the CERN Proton Synchrotron. The first two hadron colliders at CERN – the Intersecting Storage Rings (ISR) and the Super Proton Synchrotron (SPS) proton–antiproton collider – are then discussed. The ISR’s breakthroughs were numerous, including the discovery of Schottky scans, the demonstration of stochastic cooling and absolute luminosity measurements by van der Meer scans. Even more remarkable was the harvest of the SPS proton–antiproton collider, culminating with the Nobel prize awarded to Carlo Rubbia and Simon van der Meer. The necessary Antiproton Accumulator and Collector are discussed in a separate chapter, which ends with an amusing recollection: “December 1982 saw the collider arriving at an integrated luminosity of 28 inverse nanobarns and Rubbia offering a ‘champagne-only’ party with 28 champagne bottles!” Antiproton production methods are covered in detail, including a description of the manoeuvres needed to manipulate antiproton bunches and of the production of cold antihydrogen atoms. This subject is continued in a later chapter dedicated to CERN’s new ELENA antiproton facility.
The Fermilab proton–antiproton collider started later than the SPS, but eventually led to the discovery of the top quark by the CDF and D0 collaborations. The Fermilab antiproton recycler and main ring are described, followed by a chapter dedicated to the Tevatron, which was the first superconducting collider. The first author remarks that, over the years, some 10¹⁶ antiprotons were accumulated at Fermilab, corresponding to about 17 nanograms and more than 90% of the world’s total man-made quantity of nuclear antimatter. This section of the book concludes with a description of the lepton–proton collider HERA at DESY, the GSI heavy-ion facility, and the rare-isotope facility REX at ISOLDE. Space is also given to the accelerator that was never built, the US Superconducting Super Collider (SSC), of which “the hopeful birth and painful death” is recounted.
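That figure is straightforward to verify: taking the antiproton mass to be approximately \(1.67 \times 10^{-27}\,\mathrm{kg}\),

\[ 10^{16} \times 1.67 \times 10^{-27}\,\mathrm{kg} \approx 1.7 \times 10^{-11}\,\mathrm{kg} \approx 17\,\mathrm{ng} , \]

in agreement with the number quoted in the book.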
The following 25 chapters are devoted to accelerators for the 21st century, with the section on “Accelerators for high-energy physics” centred on the Large Hadron Collider (LHC). In the main article, magisterially written, it is recalled that the 27 km length of the LEP tunnel was chosen with the future installation of a proton–proton collider already in mind, and that the first LHC workshop was organised as early as 1984. The following chapters are dedicated to ion–ion collisions at the LHC and to the upgrades of the main ring and the injector. The high-energy version of the LHC and the design of a future 100 km-circumference collider (with both electron–positron and proton–proton collision modes) are also covered, as well as the proposed TeV electron–proton collider LHeC. The overall picture is unique, complete and well balanced.
Other chapters discuss frontier accelerators: super B-factories, the BNL Relativistic Heavy Ion Collider (RHIC) and its electron–ion extension, linear electron–positron colliders, electron–positron circular colliders for Higgs studies and the European Spallation Source. Special accelerators for nuclear physics, such as the High Intensity and Energy ISOLDE at CERN and the FAIR project at GSI, are also discussed. Unfortunately, the book does not deal with synchrotron light sources, free-electron lasers and high-power proton drivers. However, the latter are discussed in connection with neutrino beams by covering the CERN Neutrinos to Gran Sasso project and neutrino factories.
The book is aimed at engineers and physicists who are already familiar with particle accelerators and may appreciate the technical choices and stories behind existing and future facilities. Many of its chapters could also be formative for young people thinking of joining one of the described projects. I am convinced that these readers will receive the book very positively.
By S Mobilio, F Boscherini and C Meneghini (eds)
Springer
Observed for the first time in 1947 – and long considered a problem for particle physics because it causes particle beams to lose energy – synchrotron radiation is today a fundamental tool for characterising nanostructures and advanced materials. Thanks to its brilliance, spectral range, time structure and coherence, it is extensively applied in many scientific fields, spanning materials science, chemistry, nanotechnology, earth and environmental sciences, biology, medical applications, and even archaeology and cultural heritage.
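The energy-loss problem mentioned above is quantified by the standard expression for the energy radiated per turn by an electron on a circular orbit (a textbook formula, given here for context rather than taken from this volume):

\[ U_0\,[\mathrm{keV}] \approx 88.5\, \frac{(E\,[\mathrm{GeV}])^4}{\rho\,[\mathrm{m}]} , \]

where \(E\) is the beam energy and \(\rho\) the bending radius. The steep \(E^4\) dependence explains why the effect that plagues lepton colliders is, for light-source users, a gift.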
The book collects the lecture notes from the 12th edition of the School on Synchrotron Radiation, held in Trieste, Italy, in 2013 and organised by the Italian Synchrotron Radiation Society in collaboration with Elettra-Sincrotrone Trieste. The book is organised in four parts. The first describes the emission of synchrotron and free-electron laser sources, as well as the basic aspects of beamline instrumentation. In the second part, the fundamental interactions between electromagnetic radiation and matter are illustrated. The third part discusses the most important experimental methods, including different types of spectroscopy, diffraction and scattering, microscopy and imaging techniques. An overview of the numerous applications of these techniques to various research fields is then given in the fourth part, in which a chapter is also dedicated to the new generation of synchrotron radiation sources, based on free-electron lasers, which are opening the way to new applications and more precise measurements.
This comprehensive book is aimed at both PhD students and more experienced researchers, since it not only provides an introduction to the field but also discusses relevant topics of interest in depth.
Established in 2007 to fund frontier-research projects, the European Research Council (ERC) has quickly become a fundamental instrument of science policy at the European level, as well as a quality standard for academic research. This book traces the history of the creation and development of the ERC, drawing on the first-hand knowledge of the author, who was scientific adviser to the president of the ERC for four years. It covers the period between the early 2000s – when a group of strong-minded scientists pushed the idea of allocating (more) money to research projects selected for the quality of the proposals, as judged by independent, competent and impartial reviewers – and 2013, when the first ERC programme cycle was concluded.
The author is particularly interested in the politics behind those events and shows how the ERC became a reality thanks to the European Commission’s decision to support it, using a much more strategic, planned and technical approach. He also describes how the ERC was implemented and its scientific council created, discusses the “hybrid” nature of the ERC – somewhere between a programme and an institution – and the frictions this caused in its early days, and examines the process of establishing a procedure for selecting applications for funding.
While telling the story of the ERC from a critical perspective and examining its challenges and achievements, the book also offers a view of the relationship between science and policy in the 21st century.
The Many Faces of Maxwell, Dirac and Einstein Equations
By Waldyr A Rodrigues Jr and Edmundo Capelas de Oliveira
Springer
In theoretical physics, hardly anything is better known than the Einstein, Maxwell and Dirac equations. The Dirac and Maxwell equations (as well as the analogous Yang–Mills equations) form the basis of the modern description of matter via the electrodynamic, weak and strong interactions, while Einstein’s equations of special and general relativity are the foundations of the theory of gravity. Taken together, these three equations cover scales from the subatomic to the large-scale universe, and are the pillars on which the standard models of cosmology and particle physics are built. Although they constitute core information for theoretical physicists, they are rarely, if ever, presented together.
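For reference – in standard notation rather than in the particular conventions adopted by the authors – the three equations read, with the Dirac equation written in natural units (\(\hbar = c = 1\)):

\[ (i\gamma^\mu \partial_\mu - m)\psi = 0, \qquad \partial_\mu F^{\mu\nu} = \mu_0 J^\nu, \qquad R_{\mu\nu} - \tfrac{1}{2} R\, g_{\mu\nu} = \frac{8\pi G}{c^4}\, T_{\mu\nu} , \]

the homogeneous Maxwell equations \(\partial_{[\alpha} F_{\beta\gamma]} = 0\) being understood.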
This book aims to remedy the situation by providing a full description of the Dirac, Maxwell and Einstein equations. The authors go further, however, by presenting the equations in several different forms. Their aim is twofold. On one hand, different expressions of these famous formulae may help readers to view a given equation from new and possibly more fruitful perspectives (when the Maxwell equations are written in the form of the Navier–Stokes equations, for instance, they allow a hydrodynamic interpretation of the electrodynamic field). On the other hand, casting different equations in similar forms may shed light on the quest for unification – as happens, for example, when the authors rewrite Maxwell’s equations in Dirac-like form and use this to launch a digression on supersymmetry.
Another feature of the book concerns concepts in differential geometry that are widely used in mathematics but less familiar in theoretical physics. An example is the torsion of space–time: general differential manifolds are naturally equipped with a torsion in addition to the well-known curvature, and torsion also enters the description of Lie algebras, yet the torsional completion of Einstein gravity, for instance, has been investigated very little. The authors address this by presenting the most general differential geometry of space–time with curvature and torsion. They then use it to understand conservation laws, and more specifically to better grasp the conditions under which these conservation laws may or may not hold. A genuine conservation law expresses the fact that a certain quantity is constant over time, but in differential geometry there is no clear and unambiguous way to define an absolute time.
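As a minimal reminder of the notion at stake: for a general affine connection \(\Gamma^{\lambda}{}_{\mu\nu}\), the torsion tensor is its antisymmetric part,

\[ T^{\lambda}{}_{\mu\nu} = \Gamma^{\lambda}{}_{\mu\nu} - \Gamma^{\lambda}{}_{\nu\mu} , \]

which vanishes identically for the Levi-Civita connection assumed in general relativity but is kept nonzero in extensions such as Einstein–Cartan gravity.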
As an additional important point, the book contains a thorough discussion about the role of active transformations for physical fields (to be distinguished from passive transformations, which are simply a change in co-ordinates). Active transformations are fundamental, both to define the transformation properties of specific fields and also to investigate their properties from a purely kinematic point of view without involving field equations. A section is also devoted to exotic or new physical fields, such as the recently introduced “ELKO” field.
Aside from purely mathematical treatments, the book contains useful comments about fundamental principles (such as the equivalence principle) and physical effects (such as the Sagnac effect). The authors also pay attention to clarifying certain erroneous concepts that are widespread in physics, such as assigning a nonzero rest mass to the photon.
In summary, the book is well suited for anyone who has an interest in the differential geometry of twisted–curved space–time manifolds, and who is willing to work on generalisations of gravity, electrodynamics and spinor field theories (including supersymmetry and exotic physics) from a mathematical perspective. Perhaps the only feature that might discourage a potential reader, which the authors themselves acknowledge in the introduction, is the considerable amount of sophisticated formalism and mathematical notation. But this is the price one has to pay for such a vast and comprehensive discussion about the most fundamental tools in theoretical physics.
Lying midway between the history and the philosophy of science, this book illuminates a fascinating period in European history during which mathematics clashed with common thought and religion. Set in the late 16th and early 17th centuries, it describes how the concept of infinitesimals – a quantity that is explicitly nonzero and yet smaller than any measurable quantity – took a central role in the debate between ancient medieval ideas and the new ideas arising from the Renaissance. The former were represented by immutable divine order and the principle of authority, the latter by social change and experimentation.
The idea of indivisible quantities and their use in geometry and arithmetic, which had already been developed by ancient Greek mathematicians, underwent its own renaissance 500 years ago, at the same time as Martin Luther launched the Reformation. The consequences for mathematics and physics were enormous, giving rise to unprecedented scientific progress that continued for the following decades and centuries. But even more striking is that the new way of thinking built around the concept of infinitesimals crossed the borders of science and strongly influenced society, up to the point that mathematics became the main focus of the struggle between the old and new orders.
This book is divided into two parts, each devoted to a particular geographical area and period in which this battle took place. The first part leads the reader to late 16th-century Italy, where the flourishing and creative ideas of the Renaissance had produced a remarkable number of mathematicians and scientists. Here, the prominent figure of Galileo Galilei – together with Evangelista Torricelli, Bonaventura Cavalieri and others – was at the forefront of the new mathematical approach involving the concept of infinitesimals. This established the basis of inductive reasoning, which makes broad generalisations from specific observations, and led to a new science founded on experience. On the opposite side, the religious congregation of the Jesuits used these same mathematical developments in its fight against heresy and the Reformation. To them, the traditional mathematical approach was a solid basis for the absolute truth represented by the Catholic faith and the authority of the Pope. The fierce opposition of the Jesuit mathematicians led to the condemnation of Galileo and the “infinitesimalists”, with irreparable consequences for the ancient tradition of Italian mathematicians.
The second part of the book moves the reader to 17th-century England, just after the English Civil War, in the years of Cromwell’s republic and the Restoration. In that context, the new ideas represented by infinitesimals were not only condemned by the Anglican Church but also opposed by political powers. Here, the leading figure of Thomas Hobbes took the stage in the fight against the indivisibles and the inductive method. For him, traditional Euclidean geometry – which, contrary to induction, used deduction to derive every result from a few basic statements – was the highest expression of an ordered philosophical system and a model for a perfect state. Hobbes was also concerned about the threat that the new mathematics posed to the principle of authority, which he saw embodied in traditional mathematical thought. In his struggle against infinitesimals, he was confronted by the members of the newly founded Royal Society, eager for scientific progress. Among them was John Wallis, who considered mathematical knowledge a “bottom-up” inductive system in which calculus played the role that experiments play in physics. Solving many of the toughest mathematical problems of his time by infinitesimal procedures, Wallis defeated traditional geometry – and Thomas Hobbes with it. The triumph of Wallis made way for scientific progress and the advance of thought that opened the door to the Enlightenment.
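A celebrated example of what Wallis's inductive procedures could deliver is the infinite product for \(\pi\) that he derived in his Arithmetica Infinitorum:

\[ \frac{\pi}{2} = \frac{2}{1} \cdot \frac{2}{3} \cdot \frac{4}{3} \cdot \frac{4}{5} \cdot \frac{6}{5} \cdot \frac{6}{7} \cdots \]

a result obtained not by rigorous deduction of the kind Hobbes demanded, but by bold generalisation from a pattern of particular cases.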
This book is excellently written and its mathematical concepts are clearly explained, making it fully accessible to a general audience. With his fascinating narrative, the author intrigues the reader, depicting the historical background and, in particular, recounting the plots of the Holy See, the Jesuits’ fight for power, the Reformation, the absolutist power of the kings, and the early steps of Europeans towards democracy and freedom of thought. The book includes extensive notes at the end, a useful index of concepts, a timeline and a “dramatis personae” section, which is divided between “infinitesimalists” and “non-infinitesimalists”. Finally, the images and portraits included in the book add to the reader’s enjoyment.
Gaseous photomultipliers are gas-filled devices capable of detecting single photons (in the visible and UV range) with high position resolution. They are used in various research settings, in particular high-energy physics, and are among several types of contemporary single-photon detectors. This book provides a detailed comparison between photosensitive detectors based on different technologies, highlighting the advantages and disadvantages of each for diverse applications.
After describing the main principles underlying the conversion of photons to photoelectrons and the electron avalanche multiplication effect, the characteristics (and requirements) of position-sensitive gaseous photomultipliers are discussed. A long section of the book is then dedicated to describing and analysing the development of these detectors, which evolved from photomultipliers filled with photosensitive vapours to devices using liquid and then solid photocathodes. UV-sensitive photodetectors based on caesium iodide and caesium telluride, which are mainly used as Cherenkov-ring imaging detectors and are currently employed in the ALICE and COMPASS experiments at CERN, are presented in a dedicated chapter. The latest generation of gaseous photomultipliers, sensitive up to the visible region, are also discussed, as are alternative position-sensitive detectors.
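The avalanche multiplication these devices rely on is conventionally characterised by the gas gain – a standard relation, recalled here for orientation rather than quoted from the book:

\[ M = \exp\!\left( \int_0^d \alpha(x)\, \mathrm{d}x \right) , \]

where \(\alpha\) is the first Townsend coefficient and \(d\) the multiplication-gap length; for a uniform field this reduces to \(M = e^{\alpha d}\).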
The authors then focus on the Cherenkov effect, its discovery and the way it has been used to identify particles. The introduction of ring-imaging Cherenkov (RICH) detectors was a breakthrough and led to the application of these devices in various experiments, including the Cosmic AntiParticle Ring Imaging Cherenkov Experiment (CAPRICE) and the former CERN experiment CPLEAR (Charge Parity violation at the Low Energy Antiproton Ring).
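The underlying relation that RICH detectors exploit is the standard Cherenkov condition:

\[ \cos\theta_c = \frac{1}{n\beta} , \]

where \(n\) is the refractive index of the radiator and \(\beta\) the particle's velocity in units of \(c\); light is emitted only above the threshold \(\beta > 1/n\), and measuring the ring radius (hence \(\theta_c\)) together with an independent momentum measurement determines the particle's mass.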
The latest generation of RICH detectors and applications of gaseous photomultipliers beyond RICH detectors are also discussed, completing the overview of the subject.
By S Tackmann, K Kampmann and H Skovby (eds)
Forlaget Historika/Gad Publishers
This book, which includes a contribution by CERN Director-General Fabiola Gianotti, presents 17 radical and game-changing ideas to help reach the 2030 Global Goals for Sustainable Development identified by the United Nations General Assembly.
Renowned and influential leaders propose 17 innovative “big bets” – bold solutions to challenges that the human race must face in the coming years. These experts in the environment, finance, food security, education and other relevant disciplines share their vision of the future and suggest new paths towards sustainability.
In the book, Gianotti responds to this call, sharing her ideas about the importance of basic science and of research in science, technology, engineering and maths (STEM) in underpinning innovation, sustainable development and the improvement of global living conditions. After giving examples of breakthrough innovations in technology and medicine that arose from the pursuit of knowledge for its own sake, Gianotti contends that we need science and scientifically aware citizens to be able to tackle pressing issues, including the drastic reduction of poverty and hunger and the provision of clean and affordable energy. Finally, she proposes a plan to secure STEM education and funding for basic scientific research.
Published as part of the broader Big Bet Initiative to engage stakeholders around new and innovative ideas for global development, this book provides fresh points of view and credible solutions. It would appeal to readers who are interested in innovation and sustainability, as well as in the role of science in such a framework.