by Jørgen Rammer, Cambridge University Press. Hardback ISBN 9780521874991 £45 ($85). E-book format ISBN 9780511292620, $68.
This textbook presents quantum field theoretical applications to systems out of equilibrium. It introduces the real-time approach to non-equilibrium statistical mechanics and the quantum field theory of non-equilibrium states in general. It offers two ways of learning how to study non-equilibrium states of many-body systems: the mathematical canonical way and an intuitive way using Feynman diagrams. The latter provides an easy introduction to the powerful functional methods of field theory, and the use of Feynman diagrams to study classical stochastic dynamics is considered in detail. The developed real-time technique is applied to numerous phenomena in many-body systems, and exercises throughout aid self-study.
by Noboru Miura, Oxford University Press. Hardback ISBN 9780198517566 £65 ($150).
This book describes the basic concepts of various physical phenomena in semiconductors and their modulated structures under high magnetic fields. The topics cover magneto-transport phenomena, cyclotron resonance, far-infrared spectroscopy, magneto-optical spectroscopy, diluted magnetic semiconductors in high magnetic fields, as well as the recent advances in the experimental techniques needed for high field experiments. Starting from the introductory part describing the basic theoretical background, each chapter introduces typical experimental data, obtained in very high magnetic fields mostly in the pulsed field range at 20–100 T. The book will serve as a useful guide for researchers and students with an interest in semiconductor physics or in high magnetic fields.
By Maurizio Gasperini, Cambridge University Press. Hardback ISBN 9780521868754 £45. E-book format ISBN 9780511332296 $68.
The standard cosmological picture of our universe emerging from a Big Bang leaves open many fundamental questions, which string theory, a unified theory of all forces of nature, should be able to answer. The first book dedicated to string cosmology, it contains a pedagogical introduction to the basic notions of the subject. It describes the new possible scenarios suggested by string theory for the primordial evolution of our universe and discusses the main phenomenological consequences of these scenarios, stressing their differences from each other, and comparing them to the more conventional models of inflation. It is self-contained, and so can be read by astrophysicists with no knowledge of string theory, and high-energy physicists with little understanding of cosmology. Detailed and explicit derivations of all the results presented provide a deeper appreciation of the subject.
By Sergio Ferrara and Rudolf M Mössbauer, World Scientific Series in 20th Century Physics, Volume 39. Hardback ISBN 9789812700186 £69 ($128).
The “superworld” is a subject of formidable interest for the immediate future of subnuclear physics, to which Antonino Zichichi has contributed with a series of important papers of a phenomenological and theoretical nature. These papers represent a must-have collection, not only for their originality, but also for their complete analysis of expected scenarios on the basis of today’s knowledge of physics. The contributions are divided into two parts. The first deals with the problem of the convergence of the three fundamental forces of nature measured by the gauge couplings, with the onset of the energy threshold for the production of the lightest supersymmetric particles and with the existence of a gap between the string scale and the GUT scale. The second deals with the study of a theoretical model capable of including supersymmetry with the minimum number of parameters (possibly one), and agreeing with all the conditions established by string theories – this turns out to be a “one-parameter no-scale supergravity” model whose experimental consequences are investigated for present and future facilities aimed at the discovery of the first example of the superparticle.
By Guido Caldarelli, Oxford University Press. Hardback ISBN 9780199211517, £49.95 ($115).
This book presents the experimental evidence for scale-free networks and provides students and researchers with theoretical results and algorithms to analyse and understand these features. A variety of different social, natural and technological systems – from the Internet to food webs and boards of company directors – can be described by the same mathematical framework. In all these situations a graph of the elements of the system and their interconnections displays a universal feature: there are few elements with many connections, and many elements with few connections. The content and exposition make this a useful textbook for beginners, as well as a reference book for experts in a variety of disciplines.
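The "few elements with many connections, many elements with few connections" signature described above is commonly modelled by preferential-attachment growth, in which new nodes link preferentially to already well-connected nodes. The following is a minimal illustrative sketch of that process (not taken from the book; the function name and parameters are this sketch's own), showing how hubs emerge:

```python
import random
from collections import Counter

def preferential_attachment(n, m=2, seed=42):
    """Grow a graph: each new node attaches m edges, choosing
    targets with probability proportional to their current degree."""
    rng = random.Random(seed)
    # start with a small complete core of m + 1 nodes
    edges = [(i, j) for i in range(m + 1) for j in range(i + 1, m + 1)]
    # each node appears in this list once per incident edge,
    # so uniform sampling from it is degree-proportional
    stubs = [v for e in edges for v in e]
    for new in range(m + 1, n):
        targets = set()
        while len(targets) < m:
            targets.add(rng.choice(stubs))
        for t in targets:
            edges.append((new, t))
            stubs.extend((new, t))
    return edges

edges = preferential_attachment(2000)
degree = Counter(v for e in edges for v in e)
# hallmark of a scale-free network: a few highly connected hubs,
# while most nodes keep only a handful of links
print("max degree:", max(degree.values()),
      "median degree:", sorted(degree.values())[len(degree) // 2])
```

Running the sketch shows a maximum degree far above the median, the broad degree distribution that the same mathematical framework captures for the Internet, food webs and boards of directors alike.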
by Giuliano Benenti, Giulio Casati and Giuliano Strini, World Scientific. Hardback ISBN 9789812563453 £33 ($58). Paperback ISBN 9789812565280 £22 ($38).
Quantum computation and information is a new, rapidly developing interdisciplinary field. Building on the basic concepts introduced in Volume I, this second volume deals with various important aspects, both theoretical and experimental, of quantum computation and information in depth. The areas include quantum data compression, accessible information, entanglement concentration, limits to quantum computation due to decoherence, quantum error-correction, and the first experimental implementations of quantum information protocols. This volume also includes a selection of special topics, including quantum trajectories, quantum computation and quantum chaos, and the Zeno effect.
Edited by M Shifman, World Scientific. Hardback ISBN 9789812705327, £41 ($75). Paperback ISBN 9789812705334, £21 ($39).
Felix Berezin was an outstanding Soviet mathematician who was the driving force behind the emergence in the 1960s and 1970s of the branch of mathematics now known as supermathematics. The integral over the anti-commuting Grassmann variables that he introduced in the 1960s laid the foundation for the path integral formulation of quantum field theory with fermions, the heart of modern supersymmetric field theories and superstrings. This book features a masterfully written memoir by Berezin’s widow, Elena Karpel, who narrates a remarkable account of his life and struggle for survival under the totalitarian Soviet regime. Supplemented with recollections by close friends and colleagues, Berezin’s accomplishments in mathematics, his novel ideas and breakthrough works are reviewed in two articles written by Andrei Losev and Robert Minlos.
CERN would like to express its gratitude to the following for having generously sponsored the various events in the “LHC 2008” programme, and in particular the official inauguration of the LHC on 21 October 2008.
Gold Sponsor
INEO GDF Suez, Regione Sicilia.
Main Sponsors
Air Liquide, ALSTOM Power, ASG Superconductors SpA, ATI Wah Chang, Babcock Noell GMBH, HAMAMATSU Photonics, Intel, La Fondation Meyrinoise pour la Promotion Culturelle, Sportive et Sociale, Linde Kryotechnik AG, Luvata, Oracle, Ville et Etat de Genève and UBS.
Sponsors
CECOM Snc, Efacec, Farnell, Force10 Networks, La Mobilière Suisse, Peugeot Gerbier, Sun Microsystems SA, TRANSTEC Computer AG and Western Digital.
Associate Sponsors
Accel, Arcelor Mittal, Bruun & Sorensen, CAEN SpA, Carlson Wagonlit, CEGELEC, DELL SA, E4 computer engineering SpA, EAS European Advanced, EOS, Ernesto Malvestiti SpA, IBM, IEEE, Infotrend Europe Ltd, Iniziative Industriali Srl, ISQ, Italkrane Srl, Kaneka, La Tour Réseau de Soins, Migros, National Instruments, ProCurve Networking by HP, SERCO, Société Générale, Sunrise Communications AG, Super Micro Computer, Tosti srl and Xerox.
Will the LHC surprise us? I hope so. Having failed to find any completely unexpected new physics for more than 30 years, we clearly need nature’s help to progress, and the case is good.
The last really big surprise in particle physics was the discovery of the third charged lepton (the tau) in 1975. There have of course been many extremely important discoveries since then, and our understanding of particle physics has advanced enormously. But the only real surprises have been how well the Standard Model has worked, the accuracy with which experiments have been able to check its predictions, and the failure to find its missing ingredient (the mechanism that gives particles their masses: Higgs?), or any other physics beyond the Standard Model, apart from the major discovery of neutrino masses (which, however, was not a huge surprise as no principle required zero mass).
By the time of the major LEP summer study in 1978 the Standard Model was accepted by many, but by no means all, theorists and gaining supporters among experimenters. It was thought that “the (CERN) proton–antiproton collider [which had just been launched] should discover the Z, but apart from measuring its mass (with considerable errors) it will not allow us to investigate its properties in detail (it may also discover the W but this looks more difficult)”. It was argued that LEP1 would be needed to study the Z in detail (or, if it did not exist, discover what else damps the rising weak cross section at LEP energies, where the phenomenological low energy theory had to be wrong), and measure the number of neutrinos into which it can decay; LEP2 would be needed to study the W, and find the Higgs boson (or whatever else generates masses) if it had not been found at LEP1. The surprises (at least for theorists like me) were how easy it was to detect the W (which was discovered in 1983, shortly before the Z) and the accuracy of the LEP results, which led to the exciting discovery that the strengths of the electromagnetic and strong forces converge at high energies, supporting the idea that they are different manifestations of a single “grand unified” force.
At the 1978 LEP summer study the importance of insisting on a relatively long tunnel, so as not to compromise the energy of a later proton accelerator such as the LHC, was discussed, and this argument was used when LEP was approved in 1981. The first serious discussion of LHC physics took place in 1984. It was obvious that the time had come to launch R&D on LHC magnets but “less clear whether it is sensible to discuss (LHC) physics…without more complete results from the SPS collider, let alone data from LEP, SLC and HERA…crystal gazing is unusually hazardous following recent tantalizing hints of new discoveries from UA1 and UA2”. These hints, which turned out to be spurious (along with other hints of non-standard physics, from Fermilab neutrino experiments, LEP, and other experiments), remind us of the difficulty of exploring the frontier: we should not be surprised if there are false dawns at the LHC.
In 1984 it was stressed that the physics of mass generation was almost certain to be discovered at the LHC, if the question had not been settled at LEP, and that there are good reasons for expecting physics beyond the Standard Model in the LHC energy range – perhaps supersymmetry, which was discussed in some detail (it was only mentioned briefly at the 1978 summer study, although in the event a huge effort went into unsuccessful searches for supersymmetry at LEP). The case for the LHC was developed in more detail during the 1980s, but its essence has not changed.
The formal proposal to build the LHC presented to the CERN Council in 1993 was introduced with the statement that it will “provide an unparalleled ‘reach’ in the search for new fundamental particles and interactions between them, and is expected to lead to new, unique insights into the structure of matter and the nature of the universe”. The LHC will take us a factor of 10 further in energy (at the level of the proton’s constituents) or equivalently to a tenth of the distance scale that has been explored so far. This alone is enough to whet scientific appetites. But pulses are really set racing by the knowledge that the LHC has a good chance of finding what generates masses (a single elementary Higgs field? Multiple or composite Higgs fields?…?) and may cast light on other mysteries, including: why the mass of the W is so small compared to the scale of the proposed grand unification of electroweak and strong interactions, the magnitude of the asymmetry between matter and anti-matter in the universe, the number of quarks and leptons, and the origin of the dark matter and dark energy that pervade the universe.
What do I expect? I am fairly confident that Higgs, in some form, will show up. If the LHC finds the standard Higgs boson and nothing else I would be extremely disappointed as we would learn essentially nothing. (The biggest surprise would be to find nothing, which would take us nowhere, while making the case for going to much higher energies compelling but probably impossible to sell.) I think there is a reasonable probability that supersymmetry will be found, and I hope this happens: the most convincing arguments are that it is the only possible symmetry allowed by quantum field theory (the mathematical language of particle physics) that has not been found (why would nature utilise all possibilities but one?); “local” supersymmetry (and all the other “continuous” symmetries are local) requires the existence of gravity; and the idea of connecting matter (fermions) with force carriers (bosons) is very appealing, although against this must be set the extravagant proliferation of particles (none found, yet?) that this implies. I am somewhat less impressed by the fact that supersymmetry would stabilize the mass of the W, which is one of the arguments that could put supersymmetry in reach of the LHC.
Thanks to the dedication of the CERN staff the LHC is now starting, and thanks to the community of users around the world, the experiments are ready to take data. It is a fantastic project. I am confident that it will work superbly. I am almost certain that it will make important discoveries, and I hope they will include surprises.
There’s a famous photograph of a young Nepalese climber standing on top of Everest in 1953. It’s the only picture there is, but Tenzing Norgay was not alone. Edmund Hillary, who declined to be photographed, accompanied him to the top. Who got there first? For a while, the two climbers refused to be drawn, saying that what matters is the achievement. And so it is with a mechanism developed in the 1960s to account for the difference between long and short-range interactions in physics.
In the early 1960s, particle physics had a problem. Long-range interactions, such as electromagnetism and gravity, could be explained by the theories of the day, but the short-range weak interaction, whose influence is limited to the scale of the atomic nucleus, could not. The idea that the carriers of the weak force must be heavy, while the carriers of long-range forces would be massless could account for the difference. Conceptually it made sense, but theoretically it couldn’t be done: where would the heavy carriers get their mass? There was no way to reconcile massive and massless force carriers in the same theoretical framework.
Inspired by the new theory of superconductivity put forward in the late 1950s by John Bardeen, Leon Cooper and John Schrieffer, theorist Yoichiro Nambu paved the way to a solution by postulating the idea that a broken symmetry could generate mass. In doing so he in turn inspired three young physicists in Europe to take the next step.
A modest beginning
I met one of those physicists, Peter Higgs, in autumn 2007 in his apartment on the top floor of a walk-up block in Edinburgh’s New Town with views over a leafy square. A slice from an LHC magnet greets visitors to the apartment, where the style is 1970s chic. Copies of Physics World and Scientific American are piled high on the coffee table, topped off with a copy of the satirical paper Private Eye. Bound copies of The Gramophone line the shelves, and the living room’s prominent feature is a chair, optimally placed to make best use of the audiophile Leak hi-fi system.
A few months later, I met Robert Brout and François Englert in a spartanly furnished office, of the kind frequently occupied by professors emeriti, at the Université Libre de Bruxelles. “Do we speak English or French?” was my first question. “Robert will be happier with English,” came the reply. I hadn’t realised that Brout was a naturalized Belgian, and that the two had first worked together in 1959, when he’d hired Englert to join him in his work at Cornell University in statistical mechanics.
As is so often the way with good ideas, the concept of the generation of particle mass through symmetry breaking was developed in more than one place at around the same time, two of those places being Brussels and Edinburgh. It was a modest beginning for a scientific revolution: just two short pages published on 31 August 1964 by Brout and Englert, and little more than a page from Higgs on 15 September. But those two papers were set to influence profoundly the development of particle physics to this day.
All three scientists are careful to attribute credit to their forerunners, Nambu most strongly. Hints of other influences come from the fact that Higgs has been known to call spontaneous symmetry breaking in particle physics the relativistic Anderson mechanism, a reference to the Nobel prize-winning physicist Philip Anderson who published on the subject in 1963; and in lectures at Imperial College London students are told about the Kibble–Higgs mechanism, in a reference to a later paper published by Gerald Guralnik, Carl Hagen and Tom Kibble.
Brout’s inspiration goes back much further, to another place where symmetry is broken spontaneously in nature with macroscopic effects. “Ferromagnetism was a puzzle in 1900,” he told me, and was solved by French physicist Pierre Weiss in 1907. Essentially, symmetry is broken by the Brout–Englert–Higgs (BEH) mechanism because the ground state of the vacuum is asymmetric, rather like the alignment of the electrons’ magnetic moments in a ferromagnetic material. In the case of the BEH mechanism, however, it’s structure in the vacuum itself that gives rise to particle masses. In the words of CERN’s Alvaro de Rújula: “The vacuum is not empty, there is a difference between vacuum and emptiness.”
The thing that fills the vacuum is a scalar field commonly known as the Higgs field. Some particles interact strongly with this field, others don’t, and it is the strength of the interaction with the field that determines the masses of certain particles. In other words, the carriers of the weak interaction, the W and Z particles, are sensitive to the structure of empty space. This is how the BEH mechanism can accommodate short and long-range interactions in a single theory. The long-awaited confirmation of the mechanism is expected in the form of excitations of the field appearing as scalar bosons (Higgs particles).
Esoteric as this may seem, there are potential astronomical implications, since what particle physicists call the Higgs field, cosmologists call the cosmological constant, or dark energy. A substance that appears to make up some 70% of the universe’s matter and energy, dark energy made itself apparent as recently as 1998 in observations of the farthest reaches of the universe.
Renormalization
Despite the emergence of the BEH mechanism, particle physics still had a problem in the mid-1960s, because the underlying theory was literally not normal. It predicted abnormal results, such as probabilities of more than 100% for given outcomes. It needed to be renormalized, and that would take the best part of a decade. Brout and Englert toyed with the idea in 1966, but a rigorous renormalization had to wait until 1971, when Gerardus ‘t Hooft, a student of Martinus Veltman at Utrecht University, published the first of a series of papers by student and supervisor that would rigorously prove the renormalizability of the theory. They were rewarded with a trip to Stockholm in 1999 to collect the Nobel Prize in Physics.
If Brout, Englert and Higgs had provided a cornerstone of the Standard Model, ‘t Hooft and Veltman gave it its foundations. From then, theoretical and experimental progress was rapid, and accompanied by a rich harvest of Nobel Prizes. In 1973, a team at CERN led by André Lagarrigue found the first evidence for heavy carriers of the weak interaction. In 1979, Sheldon Glashow, Steven Weinberg and Abdus Salam received the Nobel Prize for Physics for their work on unifying the electromagnetic and weak interactions, the theory in which the BEH mechanism plays its crucial role. Then in 1984, Carlo Rubbia and Simon van der Meer received the Nobel Prize for their decisive contributions to the programme that discovered the carriers of the weak force, the W and Z particles, at CERN in 1982–1983.
“The experimental discovery of the W and Z particles confirmed both the validity of the electroweak model,” explained François Englert, “and of the BEH mechanism.” There remained, however, a missing ingredient. A machine was needed that could shake the scalar boson of the BEH mechanism out of its hiding place in the vacuum of space. That machine is the LHC. Many scientists would, and indeed have, bet on the discovery of the particle, but however elegant and enticing the work of Brout, Englert and Higgs, no-one can be sure it is right until the scalar boson has been seen. Nature might have chosen to endow particles with mass in a different way, so until the particle is found, the BEH mechanism remains no more than speculation. Whatever the case, the LHC will give us the answer.
There are many stories as to how the BEH mechanism and its associated particle came to be named after Higgs. The one Higgs told me involves a meeting that he had with fellow theorist Ben Lee at a conference in 1967, at which they discussed Higgs’s work. Then along came renormalization, making field theory fashionable, and another conference. “The conference at which my name was attached to pretty well everything connected with spontaneous symmetry breaking in particle physics was in ’72,” explained Higgs. It was a conference at which Lee delivered the summary talk.
Brout, Englert and Higgs have rarely met, but they have much in common. All came to a field, unfashionable with particle theorists at the time, from different areas of science. “Sometimes you do things in a domain in which you are not an expert and it plays a big role,” explained Englert. “We had no reason to dismiss field theory because people didn’t use it.” The three also agree on many things – their inspiration for one. “What was interesting me back in the early 1960s was the work of Nambu, who was proposing field theories of elementary particles in which symmetries were broken spontaneously in analogy to the way that it happens in a superconductor,” said Higgs. Englert said it slightly differently: “We were very impressed by the fact that Nambu transcribed superconductivity in terms of field theory,” he said. “That’s a beautiful paper.”
The three are in agreement about the results that the LHC might bring. “The most uninteresting result would be if we find nothing other than that which we’re most expecting,” said Englert. According to Higgs: “The most uninteresting result would be if they found the Higgs boson and nothing much else.” “If the Standard Model works, then we’re in trouble,” said Brout. “We’ll have to rely on human intelligence to go further,” said Englert, completing the thought. And the most interesting direction for physics? Gravity, they all concur. “Any crumbs that fall off it would have major effects on the world of elementary particles,” said Brout. “In my heart, gravity is the secret to everything.”
Physicists and mountaineers have much in common. They are on the whole fiercely competitive, yet collaborative at the same time, and they can be magnanimous to an extraordinary degree. “I was delighted to discover that we are sharing the prize,” Higgs said on being informed that the European Physical Society had awarded him a prestigious prize in 1997. “I get a lot of publicity for this work, but (Brout and Englert) were clearly ahead of me.”
So who did get there first? At Everest, it turns out to have been Hillary who put his foot on the summit first. In physics Brout and Englert were first to publish, but that’s not what matters. In physics, as in mountaineering, it’s the achievement that counts.