Rooted in symmetry: Yang reflects on a life of physics

Chen Ning Yang first came to CERN in 1957, the year he shared the Nobel Prize in Physics with Tsung-Dao Lee for their proposal that the weak interaction violates parity symmetry – at a fundamental level, the mirror symmetry between left and right is broken. Almost 50 years later, Yang was again at CERN speaking to a packed auditorium about his thoughts on the important themes in physics over the second half of the 20th century. He can do so with authority: he not only knew great physicists such as Wolfgang Pauli and Paul Dirac, but he has also made many fundamental contributions to physics from the 1950s onwards.

When Yang arrived at CERN in 1957 the theory group was housed in a hut at Cointrin, by the villa still visible behind the fences surrounding the airport, and he recalls meeting people such as Jack Steinberger, Oreste Piccioni and Bruno Ferretti. But the visit also had a personal significance for Yang, who had lived in the US for 12 years, having left his native China in 1945. In the US he had gained his PhD, working under Edward Teller at the University of Chicago, and by 1957 he was married with a six-year-old son. It was a time of difficult relations between China and the US, with no possibility for Yang and his young family to meet his parents in either country. However, the trip to Geneva offered Yang the opportunity to arrange for his father to come from China for a six-week visit and meet his wife and son. This happy experience was repeated on further visits to CERN in 1960 and 1962.

Throughout his long career Yang has made many contributions to physics, and he achieved two of the best known in particle physics – Yang–Mills theory and parity violation – by the time he was 34. Yang says that he was fortunate to come into physics when the concept of symmetry was beginning to be appreciated.

In the 1920s people did not like the concept of symmetry, as they were sceptical of its new mathematics of groups – there were those who even talked of “the group pest”. But in the 1930s physicists began to realize that symmetry was necessary to describe atomic physics; in particular, symmetry groups explained the structure of the Periodic Table. By the 1940s its application had extended to nuclear and particle physics.

Yang worked on group theory for his PhD thesis under Teller, and this firmly anchored his interest in groups and the emerging field of symmetry in particle physics. He now reflects: “When young, the best thing that you can do is to launch yourself into a field that is just beginning.” This is exactly what Yang did.

Yang–Mills theory

By 1954 he had written with Robert Mills what he still regards as his most important paper, laying out the basic principles of what has become known as Yang–Mills theory. The theory is now a cornerstone of the Standard Model of particle physics, but at the time it did not agree with experiment. “We couldn’t escape the question of the mass of the spin-1 particles that come out of it,” recalls Yang, “although we did discuss it at the end of the paper and implied that there may be other reasons for the mass not being zero.” So why did they write the paper? Yang says that he appreciated the beauty of the structure and believed that it should be published. Samuel Goudsmit, who together with George Uhlenbeck had discovered the electron’s spin, was the editor and speedily published the paper.

On the subject of his Nobel prize-winning work with Lee, Yang says he was very proud of the paper on parity violation. “It caused a great sensation because of its ‘across the board’ character,” he recalls. “It was relevant to nuclear physics as well as high-energy physics. There were hundreds of experiments in the following two years.” The paper was published on 1 October 1956, and on 27 December C S Wu and her colleagues had the results demonstrating that parity is violated in weak decays. Yang says that Wu contributed more than just her technical expertise: “She did not believe the experiment would be so exciting, but believed that if an important principle had not been tested, it should be. No-one else wanted to do it!”

Since 1957 Yang has visited CERN many times and has seen the latest accelerator installations, each larger and more complex than the previous generation. This time he was taken to see preparations for ATLAS and CMS, the huge general-purpose detectors being built for the LHC. Yang says that seeing these installations is “very educational for a theorist who doesn’t tangle with these complex detectors and the engineers who are putting it all together”. He was “more than impressed” he says: “It is quite unbelievable. My only regret is that I may not be around to see the results.”

The changing face of particle physics

As the detectors become larger and more complex they are also being built and run by physicists and engineers who are collaborating on a very large scale. How particle physics is done has changed a great deal in the 50 years since Yang’s first visit to CERN. “Now group members are named by countries,” Yang says. “We have progressed from teams of colleagues in an institute, to several institutes, to several countries.” At CMS in particular he was impressed by all the young people from different countries who were participating in data-taking tests during his visit.

Looking to the future, Yang believes that astronomy is going to be an exciting field, because its many peculiar aspects that are not yet understood will provide plenty of opportunities for exploration. More fundamentally, he thinks that while the nature of physics has changed in the 21st century, it will continue to thrive, resulting in important contributions to science.

So what of high-energy physics? Is it coming to an end? Yang believes that the type of particle physics studied over the past 50 years is not likely to continue, for two reasons: one external and one internal. He points out that his generation was fortunate in that they launched into the unknown where there was a great deal to be discovered. Now, he says, we have achieved marvellous collaborative efforts with the LHC, but there are limits to what governments will support. This is the external factor: funding will limit expansion unless there is some bright new idea. “We need to reduce the budget by a factor of 10,” he says.

As for the internal factor, he sees that the subject faces more difficult mathematical structures. He notes that field theory today has become highly nonlinear and is very difficult compared with what was thought to be difficult in the 1940s.

In the meantime, what does he think will be the most important discovery at the LHC? “Everybody is focusing on the Higgs and most feel it will be discovered,” he observes. “But,” he adds, “it may be more exciting eventually if it is not discovered.”

Physicists gather for an extravaganza of beauty

The 11th International Conference on B-Physics at Hadron Machines (Beauty 2006) took place on 25–29 September 2006 at the University of Oxford. This was the latest in a series of meetings dating back to the 1993 conference held at Liblice Castle in the Czech Republic. The aim is to review results in B-physics and CP violation and to explore the physics potential of current and future-generation experiments, especially those at hadron colliders. As the last conference in the series before the start-up of the LHC, Beauty 2006 was a timely opportunity to review the status of the field, and to exchange ideas for future measurements.

More than 80 participants attended the conference, ranging from senior experts in the heavy-flavour field to young students. The sessions were held in the physics department, with lively discussions afterwards. There were fruitful exchanges between the physicists from operating facilities and those from future experiments (LHCb, ATLAS and CMS), with valuable input from theorists.

The conference reviewed measurements of the unitarity triangle, which is the geometrical representation of quark couplings and CP violation in the Standard Model. The aim is to find a breakdown of the triangle through inconsistencies in the measurements of its sides and its angles, α, β and γ (also denoted φ₂, φ₁ and φ₃), as determined through CP-violating asymmetries and related phenomena.
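
As a reminder of the standard definitions behind this picture (textbook CKM unitarity, not specific to any result reported at the conference), the triangle follows from

\[
V_{ud}V_{ub}^{*} + V_{cd}V_{cb}^{*} + V_{td}V_{tb}^{*} = 0 ,
\]

with the angles defined as

\[
\alpha \equiv \arg\!\left(-\frac{V_{td}V_{tb}^{*}}{V_{ud}V_{ub}^{*}}\right), \qquad
\beta \equiv \arg\!\left(-\frac{V_{cd}V_{cb}^{*}}{V_{td}V_{tb}^{*}}\right), \qquad
\gamma \equiv \arg\!\left(-\frac{V_{ud}V_{ub}^{*}}{V_{cd}V_{cb}^{*}}\right).
\]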

The statistics and the quality of the data from the first-generation asymmetric-energy e⁺e⁻ B-factories are immensely impressive. The BaBar and Belle experiments, at PEP-II and KEKB respectively, passed a significant milestone when they reached a combined integrated luminosity of 1000 fb⁻¹ (1 ab⁻¹), with 10⁹ bb̄ pairs now produced at the Υ(4S). The experiments are approved to continue until 2008 and should double their data-sets.

The B-factories have studied with high precision the so-called golden mode of B-physics, the decay B0→J/ΨKs. The CP-asymmetry in this decay accesses sin 2β with negligible theoretical uncertainty and the measured world-average value in this and related channels is now 0.675 ± 0.026. The four-fold ambiguity in the value of β can be reduced to two by measuring cos 2β in channels such as B0→D*D*Ks. The results now strongly disfavour two of the solutions, although higher statistics and further theoretical effort are necessary to verify this interpretation.
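
To recall why this channel is so clean, for a CP eigenstate dominated by a single weak phase the time-dependent CP asymmetry takes the textbook form (a schematic sketch, ignoring the small width difference and assuming no direct CP violation):

\[
A_{CP}(t) \equiv
\frac{\Gamma\big(\bar B^{0}(t)\to J/\psi K_S\big) - \Gamma\big(B^{0}(t)\to J/\psi K_S\big)}
     {\Gamma\big(\bar B^{0}(t)\to J/\psi K_S\big) + \Gamma\big(B^{0}(t)\to J/\psi K_S\big)}
= \sin 2\beta \, \sin(\Delta m_d\, t),
\]

so a fit to the decay-time distribution measures sin 2β directly.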

A possible hint of physics beyond the Standard Model may appear in the measurement of sin 2β in b→s “penguin” decays (e.g. B0→φKs). The value averaged over a number of channels, sin 2β = 0.52 ± 0.05, shows a 2.6σ discrepancy when compared with the charmonium measurement. We need more data to resolve this discrepancy, and eagerly await further studies of these penguin processes at the LHC.

BaBar and Belle have also produced important results related to the angles α and γ. The γ measurements are particularly interesting, as it had generally been assumed that this parameter was beyond the scope of the B-factories. The angle is measured through the interference of the tree-level B± → D(*)K± and B± → D̄(*)K± decay amplitudes. This strategy is intrinsically clean, and leads to a combined result for γ of (60 +38/−24)°. The errors are still large and a precise measurement of γ is beyond the reach of the B-factories. However, the LHCb experiment at CERN will improve the error on γ to less than 5°, with contributions from measurements in the Bu, Bd and Bs sectors.

Year of the Tevatron

Despite the great successes of the B-factories, Beauty 2006 focused on B-physics at hadron machines, and 2006 has been the “Year of the Tevatron”. The CDF and D0 experiments at Fermilab’s Tevatron have not only demonstrated the proof-of-principle of B-physics at hadron machines, but have also made measurements that are highly competitive and complementary to those of the B-factories, in particular through the unique access that hadron machines have to the Bs sector. The results indicate the future at the LHC, where there should be 100 times more statistics.

The highlight of the conference was the first 5σ observation of Bs oscillations, presented by the CDF collaboration. They reported the mass difference between the mass eigenstates, Δms, as 17.77 ± 0.10 (stat) ± 0.07 (syst) ps⁻¹, in agreement with Standard Model expectations. Data from hadronic channels, such as Bs→Dsπ, have greatly enhanced the statistical power of the analysis; this measurement relies on the precision vertex detector. The measurement of Δms and Δmd allows the ratio of Cabibbo–Kobayashi–Maskawa (CKM) matrix elements |Vtd|/|Vts| to be extracted with around 5% systematic uncertainty (with input from lattice theory), which fixes the third side of the unitarity triangle with the same precision.
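
The CKM extraction quoted above uses the standard relation between the two mixing frequencies, in which the lattice input enters only through the SU(3)-breaking ratio ξ (a schematic form; the numerical inputs are those of the lattice averages):

\[
\frac{\Delta m_s}{\Delta m_d}
= \frac{m_{B_s}}{m_{B_d}}\,\xi^{2}\,\left|\frac{V_{ts}}{V_{td}}\right|^{2},
\qquad
\xi \equiv \frac{f_{B_s}\sqrt{\hat B_{B_s}}}{f_{B_d}\sqrt{\hat B_{B_d}}},
\]

which is why the uncertainty on ξ from the lattice translates essentially one-to-one into the quoted precision on |Vtd|/|Vts|.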

The study of rare processes dominated by loop effects provides an important window on new physics, since such processes could receive significant contributions from the exchange of new heavy particles. The Tevatron experiments are intensively searching for the very rare decay Bs→μμ, which is expected to have a branching ratio of order 10⁻⁹ in the Standard Model but is significantly enhanced in many supersymmetric extensions. The Tevatron is currently sensitive at the 10⁻⁷ level and is striving to improve this reach. The LHC experiments will explore down to the Standard Model value.

Towards the LHC

With the start-up of the LHC, B-physics will enter a new phase. Preparations for the experiments are now well advanced, as are the B-triggers necessary to enrich the samples in signal decays. Talks at the conference described the status of the detectors and their first running scenarios. The LHC pilot run scheduled for late 2007 will yield minimal physics-quality data, but will be invaluable for commissioning, calibrating and aligning the detectors. Researchers will accumulate the first real statistics for physics measurements in summer 2008. Key goals in the first two years of operation will be the first measurement of CP violation in the Bs system; a measurement of the Bs mixing phase in Bs→J/Ψφ approaching the Standard Model value (around 2°); the likely first observation of the decay Bs→μμ; studies of the B angular distributions sensitive to new physics in the channels Bu,d→K*μμ; and precise measurements of the angles α and γ. LHCb will cover a wide span of measurements, whereas ATLAS and CMS will focus on channels that can be selected with a (di-)muon trigger.

Participants at the conference made a strong science case for continued B-physics measurements beyond the baseline LHC programme, to elucidate the flavour structure of any new physics discovered. On the timescale of 2013, the LHCb collaboration is considering the possibility of upgrading the experiment to increase the operational luminosity to 10 times the present design, to accumulate around 100 fb⁻¹ over five years. In addition there are two proposals on a similar timescale for asymmetric e⁺e⁻ “Super Flavour Factories” at luminosities of around 10³⁶ cm⁻² s⁻¹ – SuperKEKB and a linear-collider-based design (ILC-SFF) – each giving some 50 ab⁻¹ of data by around 2018. The LHCb upgrade and the e⁺e⁻ flavour factories largely complement each other in their physics goals.

Social activities enabled discussions outside of the conference room. Keble College provided accommodation and hosted the banquet at which Peter Schlein, the founder of the conference series and chair for the first 10 meetings, was thanked for his efforts over the years and his pioneering contributions to B-physics at hadron machines.

The conference was extremely lively: B-physics continues to flourish and has an exciting future ahead. The B-factories and the Tevatron have led the way, but there is still much to learn. Heavy flavour results from ATLAS, CMS and, in particular, LHCb seem certain to be a highlight of the LHC era.

Uppsala brings neutrino telescopes back to Earth

IceCube hot-water drill

Neutrino telescopes are the biggest particle detectors. IceCube, currently being built at the South Pole, will have an instrumented volume of 1 km³ when complete, and a similar project, KM3NeT, is planned for the Mediterranean. Detectors such as AMANDA and the Baikal Neutrino Telescope have reached effective detection areas of tens of thousands of square metres. These huge arrays of photomultiplier tubes, buried deep in clear ice or water, primarily search the sky for high-energy neutrinos from violent cosmic phenomena, including gamma-ray bursts, active galactic nuclei and supernova remnants. However, detecting extraterrestrial neutrinos can also provide a unique window on physics beyond the Standard Model of particle physics, with topics ranging from searches for new particles to the effects of extra dimensions.

On 20–22 September 2006 the Department of Nuclear and Particle Physics of Uppsala University hosted the first Workshop on Exotic Physics with Neutrino Telescopes, which focused on the physics that neutrino telescopes can address beyond astrophysics. The next generation of such detectors will be operational in less than a decade and will push the sensitivity to new physics to levels that can probe many existing theoretical models. At Uppsala we felt that it was timely to provide a forum to summarize the current status of the field and where it can go in the next few years.

Research in underground labs or in accelerators is an important counterpart to searches using neutrino telescopes. The first session reviewed accelerator results on new physics beyond the Standard Model in the post-LEP era, and discussed where the LHC will lead. It also summarized the results and perspectives of searches in underground labs. These searches complement each other, and the understanding of any new effect will need signals observed using different detection techniques to be coherently interpreted.

Searching for dark matter

There were also reviews from smaller experiments such as MACRO, Super-Kamiokande and the Baksan Neutrino Observatory. During the 1990s, these collaborations provided the first limits from searches for new particles and dark matter, as well as on scenarios for new fundamental physics.

The search for dark-matter candidates is perhaps the most developed of the “exotic” topics covered by neutrino telescopes, both theoretically and experimentally. Particle physics provides several candidates for dark matter in the form of weakly interacting massive particles (WIMPs) that have survived from the Big Bang. The neutralino of the minimal supersymmetric Standard Model (MSSM) is one of them, but the lightest Kaluza–Klein mode, which arises in models with extra space–time dimensions, is also viable. If they exist, such particles should cluster gravitationally as halos in galaxies, and by the same principle accumulate in the centre of heavy objects, such as the Sun or the Earth. If the concentration is high enough, they could annihilate in pairs, producing neutrinos as a by-product. Neutrino telescopes are looking for an excess of neutrinos from the centre of the Sun or the Earth, which would indicate this process. There are competitive limits from the MACRO, Super-Kamiokande, Baksan, Baikal and AMANDA detectors, and experiments have begun to probe MSSM parameter space.

Survival probability

More exotic candidates for dark matter exist as non-topological solitons, or Q-balls. These are coherent stable states of quark, lepton and Higgs fields and, unlike other WIMPs, they can be heavy, up to 100 TeV. Q-balls can leave a signature in a detector by catalysing proton decay as they pass through – the photomultiplier tubes of neutrino telescopes would record the Cherenkov light of the proton-decay products. Another possibility is stable strange-quark matter in the form of nuclearites, with baryon numbers up to 10²³ but low values of Z/A, the ratio of atomic number (Z) to mass number (A). Such particles could also explain cosmic rays above the Greisen–Zatsepin–Kuzmin (GZK) cut-off, if next-generation air-shower arrays confirm such high-energy particles.

Mini black holes and multi-bangs

The production of mini black holes in the collisions of high-energy neutrinos with partons in the nuclei of matter is one manifestation of low-scale gravity. If the centre-of-mass energy of the interaction exceeds the Planck scale, a microscopic black hole can form. In our 4D world, however, the Planck scale lies at energies around the Planck mass of 10¹⁹ GeV, while the best man-made accelerators reach only tera-electron-volt energies (10³ GeV) in the centre of mass. But in 4+D space–time dimensions the Planck scale may be much lower, and a 10¹⁰ GeV neutrino interacting with a nucleus inside the detector could produce a mini black hole. Although this might seem an extremely high energy, such neutrinos should be guaranteed by the interactions of cosmic rays with the all-permeating relic photons of the cosmic microwave background.
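
The lowering of the gravity scale can be sketched with the usual large-extra-dimension relation between the four-dimensional Planck mass and the fundamental scale M_D, assuming D flat extra dimensions compactified at a common size R (an order-of-magnitude estimate only):

\[
M_{\rm Pl}^{2} \sim M_{D}^{\,2+D}\, R^{D},
\]

so for sufficiently large R the fundamental scale M_D can sit far below 10¹⁹ GeV, possibly near the TeV range, bringing black-hole production within reach of ultra-high-energy neutrino interactions.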

A neutrino telescope could detect the immediate Hawking evaporation of a mini black hole into a burst of Standard Model particles (in around 10⁻²⁷ s) through the Cherenkov light emitted by the products. There are many free parameters in models with extra dimensions and the uncertainties in the predictions are large. However, up to 10 black-hole events a year could be expected in a 1 km³ detector in the most favourable scenarios, taking into account the existing limits on the ultra-high-energy neutrino flux.

Gravity models at tera-electron-volt energies provide another intriguing possibility: elastic neutrino–parton scattering through the exchange of D-dimensional gravitons. Unlike in black-hole production, the neutrino is not destroyed, and continues on its way, ready for another elastic interaction after a mean free path that, for a given energy, depends on the number of extra dimensions. The energy lost in each interaction goes into a hadronic shower, producing a very unusual signature in a neutrino telescope: multiple particle showers without a lepton among them. Current calculations predict that a 1 km³ detector could detect a handful of such events each year, probing up to D = 6 extra dimensions.

Tests of fundamental physics

It is now eight years since Super-Kamiokande announced the observation of neutrino oscillations, and this effect continues to be the only established observation of physics beyond the Standard Model. We understand neutrino oscillations as a typical quantum-mechanical superposition effect between propagation (mass) and flavour states. However, there can be other causes of oscillations if certain fundamental physics laws are broken at some scale. These include violation of the equivalence principle (VEP), in which the different neutrinos couple differently to the gravitational potential; violation of Lorentz invariance (VLI), in which the different neutrinos can reach different asymptotic velocities, giving rise to velocity-induced oscillations; and non-standard neutrino interactions with matter at very high energies.

Results from Super-Kamiokande, MACRO and the Sudbury Neutrino Observatory show that, if they exist, such processes are subdominant, and there are limits on their relative strength. However, their dependence on the energy of the neutrino makes such processes interesting for large-scale neutrino telescopes. While the wavelength of standard oscillations is proportional to Eν, in the case of VEP or VLI the oscillation wavelength is proportional to 1/Eν, and neutrino telescopes will provide much better sensitivity, for example by looking for distortions of the angular dependence of the high-energy tail of the atmospheric neutrino flux.
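
The different energy dependence can be made explicit by comparing oscillation phases. Schematically, for mass-induced oscillations versus a VLI term characterised by a velocity splitting δv (conventions differ between analyses, so this is only an illustrative form):

\[
\Phi_{\rm mass} \simeq \frac{\Delta m^{2} L}{4E_{\nu}},
\qquad
\Phi_{\rm VLI} \simeq \frac{\delta v}{2}\, L\, E_{\nu},
\]

so the corresponding oscillation wavelengths scale as Eν and 1/Eν respectively, and the effect of any VLI (or VEP) term grows with energy.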

Other contributions to the workshop covered the possibility of explaining trans-GZK cosmic rays as neutrinos with an increased interaction cross-section with matter at ultra-high energies; strongly interacting neutrinos; and how top-down scenarios can produce high-energy neutrinos from the decay products of super-massive Big Bang relics or topological defects. No doubt a discussion on vortons or monopolonia belongs to a workshop on exotic physics.

Fifty physicists from 16 countries attended the workshop. The Ångström laboratory, housing the Uppsala University physics departments and the newest building in one of the oldest universities in Europe, provided a pleasant venue for the meeting.

Viewpoint

I started a company, Research In Motion, while I was still at the University of Waterloo in Ontario, Canada. By the late 1990s we had developed the BlackBerry handheld mobile device. As a result I found myself in a position where I could invest in an area that I am passionate about and one that could make a big difference.

Spot an opportunity. Having observed that research funding is usually thinly spread, I decided to start a theoretical-physics institute that would focus on science that is fundamental to all human progress and at which Canada can excel.

Promote scientific openness. My driving motivation for establishing the Perimeter Institute (PI), located next to the University of Waterloo, is that I feel fundamental science needs more support. What worries me is that governments all around the world seem to be listening to the same consultant. They ask scientists to do something that will benefit the economy within five years. Of course, governments are under pressure to balance budgets and be accountable – that’s reasonable. But some of that pressure is getting transferred to universities, with unfortunate results.

Science is a global enterprise based on co-operation and openness. If you say to universities that they must justify their research with patents and licences, you collapse that openness. Efforts to commercialize too early are making researchers more secretive, hampering their ability to excel, without necessarily helping business. I wanted to challenge this trend.

Concentrate on core competencies. A strategic decision we made when creating the Institute in 1999 was to focus on a couple of very specialized fields, quantum gravity and quantum foundations, because we felt these were areas where a relatively small, high-quality team could make a big difference. This is the same strategy that originally made BlackBerry a success: it focused on doing one thing – “push e-mail” – very well rather than competing on all features. So for the first few years, PI focused on recruiting top-class researchers in these two areas to ensure that research efforts were of international calibre within a relatively short period. As the Institute’s reputation builds, we are branching out a bit more.

Build a focal point. The other decision we made early on was to house the Institute in an outstanding building. Before we built it, we spent two years going around the world and talking to people in theoretical-physics institutes and theory departments at universities, asking them what works and what doesn’t. Based on this, we put together some specifications and organized a competition, where we really let the architects go wild. The result is a building with a design that has won several prizes and is internationally recognized.

Attract investment. I invested C$100 million of my own money in PI to get it started. For the longer term it was critical to get government support. Convincing government officials took a huge effort. Part of the challenge is that not many politicians understand basic science, let alone know how to value it. This means that a lot of funding depends almost entirely on your ability to explain the benefits and on their faith in you. Early on, all levels of government (local, provincial and federal) saw the benefit of PI and decided to support the Institute with a total of about C$55 million. More recently, now that the Institute is established, a further C$50 million in public investment was warmly received.

Present your product. In the long run, you can’t rely on faith alone. So although excellent science is crucial to success, it’s really only half of the story. The other half of the Institute’s activities is about outreach. For example, PI has a summer-school programme for students from all over Canada and around the world. PI also goes on tour across Canada to give classroom instruction about physics to both students and teachers.

PI also has a programme of monthly public lectures. Sometimes we’ll have scientists like Roger Penrose discuss a weighty topic; at other times we’ll have debates about science with well-known historians and journalists. Waterloo has a population of only about 100,000, yet every month we fill a 550-seat lecture theatre, and there’s always a queue outside on standby. That’s how much interest you can generate in science, if you make the effort to open it up for people and make the research accessible.

And that’s success. Because ultimately, these are the people who vote for the governments that fund the research. If they don’t benefit from and believe in what we’re doing, it’s always going to be an uphill struggle. So in addition to directly helping students, teachers and members of the general public, there is good reason to balance good science with good outreach. We have to move beyond relying on faith.

Mike Lazaridis is founder and co-CEO of Research In Motion, makers of BlackBerry handheld devices, as well as chancellor of the University of Waterloo. Additional information about PI is available at
www.perimeterinstitute.ca.

Field Theory: A Path Integral Approach (Second edition)

by Ashok Das, World Scientific. Hardback ISBN 9812568476 £45 ($78). Paperback ISBN 9812568484 £28 ($48).

This book describes quantum field theory within the framework of path integrals. Since path integrals find use in a variety of fields in physics, the subject matter is first developed in the context of quantum mechanics before moving into specialized areas. Adding new material keenly requested by readers, this second edition is an important expansion of the popular first edition. Two extra chapters cover the path-integral quantization of gauge theories and anomalies, and a new section extends the supersymmetry chapter, describing singular potentials in supersymmetric systems.

Data Analysis: A Bayesian Tutorial (Second edition)

by D S Sivia and J Skilling, Oxford University Press. Hardback ISBN 9780198568315 £39.95 ($74.50). Paperback ISBN 9780198568322 £22.50 ($39.50).

Statistics lectures can be bewildering and frustrating for students. This book tries to remedy the situation by expounding a logical and unified approach to data analysis. It is intended as a tutorial guide for senior undergraduates and research students in science and engineering. After explaining the basic principles of Bayesian probability theory, their use is illustrated with a variety of examples ranging from elementary parameter estimation to image processing. Other topics covered include reliability analysis, multivariate optimization, hypothesis testing and experimental design. This second edition contains a new chapter on extensions to the ubiquitous least-squares procedure.

Relativity: Special, General and Cosmological (Second edition)

by Wolfgang Rindler, Oxford University Press. Hardback ISBN 9780198567318, £55 ($99.50). Paperback ISBN 9780198567325, £27.50 ($49.50).

Relativistic cosmology has recently become an active and exciting branch of research, so the revisions in this second edition mostly affect the section on cosmology. The purpose remains the same: to make relativity come alive conceptually. The emphasis is on the foundations and on presenting the necessary mathematics, including differential geometry and tensors. With more than 300 exercises, the book promotes an in-depth understanding and the confidence to tackle basic problems in this field. Advanced undergraduates and beginning graduate students in physics and astronomy will be interested in this book.

Quantum Mechanics: Classical Results, Modern Systems, and Visualized Examples (Second edition)

by Richard W Robinett, Oxford University Press. Hardback ISBN 9780198530978, £39.95 ($74.50)

This second edition is a comprehensive introduction to non-relativistic quantum mechanics for advanced undergraduate students in physics and related fields. It provides a strong conceptual background in the most important theoretical aspects of quantum mechanics, and extensive experience of the mathematical tools required to solve problems. It also gives the opportunity to use quantum ideas to confront modern experimental realizations of quantum systems, and numerous visualizations of quantum concepts and phenomena. This edition includes many new discussions of modern quantum systems, such as Bose–Einstein condensates, the quantum Hall effect and wave-packet revivals.

The Goldilocks Enigma: Why is the Universe Just Right for Life?

by Paul Davies, Penguin – Allen Lane. Hardback ISBN 9780713998832, £22.00.

The Goldilocks Enigma is the latest in a series of books from the past 20-plus years by physicist, cosmologist and internationally acclaimed outreach expert Paul Davies, covering the often vexed issue of the boundary between science and theology. The central theme of this book is the baffling truism, the so-called anthropic principle, that the universe is surprisingly bio-friendly, consistent with the evolution of life, at least on Earth and possibly elsewhere. Like the third bowl of porridge in the Goldilocks story, the universe seems to be just right for “us”, but why?

Davies guides the reader comprehensively and comprehensibly through the properties and interactions of the components of the universe, small and large, observable and imagined. He presents an equation-free exposition of particle physics and cosmology, from strings to multiverses, and in so doing reveals the wonder of the physical universe. He then augments the “facts” with an impressive sequence of analyses of how and why they came about. But is “our” universe the only one that exists? Is it the only one that can exist? If so, why? If not, what, where and when could other universes be? And does it all point to an Intelligent Designer?

Getting rid of God, numinous, eternal and responsible for all universes at all times, is a popular pursuit for some science communicators these days – Richard Dawkins springs to mind. However, Davies is not relentlessly driven to deicide: “You can’t use science to disprove the existence of a supernatural God, and you can’t use religion to disprove the existence of self-supporting physical laws.” This attitude ought to leave many an agnostic armchair physicist patiently waiting for Davies’s next book.

Goldilocks is not always easy to read, but each chapter ends with a helpful shortlist of the important facts and ideas to be retained. A couple of typos and the erroneous statement, appearing twice, that the Large Hadron Collider will collide protons with antiprotons, blemish a text that otherwise bears all the hallmarks of intelligent design.

ALICE forges ahead with detector installation

When it starts up, the ALICE experiment will observe collisions of heavy ions in CERN’s Large Hadron Collider (LHC), where “fireballs” of extremely hot and dense matter will be fleetingly made. Up to 20,000 tracks will emerge from each fireball, and one of the challenges for ALICE will be to identify different particles within this veritable “haystack”. Different elements in the armoury of particle identification for ALICE are now arriving in the experiment’s underground cavern, beginning with the High Momentum Particle Identification Detector (HMPID), which was installed inside the solenoid magnet on 23 September. This was soon followed by the first elements of the Time of Flight (TOF) system and the Transition Radiation Detector (TRD).

The HMPID will extend hadron identification in ALICE up to 5 GeV/c, complementing the reach of the other particle-identification systems. It is a ring-imaging Cherenkov detector in a proximity-focusing configuration, which uses liquid C6F14 as the radiator medium, while a 300 nm layer of caesium iodide (CsI) on the cathode of a multiwire proportional chamber converts the Cherenkov photons into electrons. This layer is divided into 161,280 pads, each 8 mm square, which are individually read out by two ASIC chips, GASSIPLEX and DILOGIC, developed with the Microelectronics Group at CERN.
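
The momentum reach quoted above follows from the basic Cherenkov relation for a radiator of refractive index n (a reminder of the principle only, not of the detector’s actual calibration):

\[
\cos\theta_{c} = \frac{1}{n\beta},
\]

so measuring the ring angle θc, together with the track momentum from the tracking detectors, determines the particle’s velocity and hence its mass.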

The complete HMPID, realized by Bari University and INFN, CERN (PH-DT1, -DT2 and -AIT groups) and the Institute for Nuclear Research, Moscow, is approximately 8 m wide by 8 m tall, and weighs about 5 t. It comprises seven identical modules shaped to fit against two sides of ALICE’s octagonal magnet. The modules, fully equipped with electronics, were individually transported to ALICE and mounted on a support structure. The complete HMPID was then lowered into the cavern and inserted inside the magnet. Three months of preparation by CERN (PH-DT1 and AIT) and Bari groups, and the help of the CERN transport service, ensured that transport and installation were accomplished within a few hours.

With an active area of about 11 m² covered with CsI, the HMPID is the largest application of this technology. Development began at CERN in the RD26 project, and it took 15 years for the method to reach the current scale and efficiency. The full production of the 42 photocathodes required to equip the detector, from CsI deposition to quality control, was done by the groups at CERN.

The first week of October saw the installation of the first two supermodules of the TOF system, which will be used to identify the thousands of pions, kaons and protons produced in each fireball. Its basic element is a multigap resistive-plate chamber (MRPC) strip, with a 120 cm × 7.4 cm active area made of a sandwich of resistive glass sheets (0.4 mm thick) and spacers, with 96 readout pads, each 3.5 cm × 2.5 cm. The full detector, which contains 1638 MRPC strips with a total of 157,248 readout channels, covers a cylindrical surface of about 150 m² at 3.7 m from the beamline, and weighs 25 t. It is the responsibility of the INFN sections in Bologna and Salerno, in collaboration with the Institute for Theoretical and Experimental Physics, Moscow, and Kangnung National University, Republic of Korea.
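
Identification by time of flight rests on a simple kinematic relation between the measured momentum p, the flight path L and the flight time t (a schematic reminder; the performance is set by the MRPC timing resolution described above):

\[
m = \frac{p}{c}\,\sqrt{\frac{c^{2}t^{2}}{L^{2}} - 1},
\]

so at fixed momentum the heavier particle arrives later, which is how the pions, kaons and protons are separated.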

The TRD must identify high-energy electron pairs generated in the fireballs. It comprises 18 supermodules that form a cylinder around the large Time Projection Chamber in the central barrel of the ALICE experiment. Each supermodule is about 7 m long and comprises 30 drift chambers in six layers. The construction of the modules is a collaboration between the Universities of Frankfurt and Heidelberg, GSI Darmstadt, the National Institute of Physics and Nuclear Engineering, Bucharest, and the Joint Institute for Nuclear Research, Dubna, with the radiators produced at the University of Münster.

During the summer the drift chambers for the first supermodule were equipped with readout electronics and inserted into the supermodule hull at the University of Heidelberg. After transportation to CERN on 27 September, the module was tested on the surface using cosmic rays before being lowered into the ALICE cavern on 9 October. The final installation took place a day later.
