A recent meeting underlined the continual freshness of studying the ancient history of the universe. Cosmology, which studies the structure of the universe, has been developing rapidly in the past few decades. The number of articles on cosmology, astrophysics and even astronomy in recent issues of CERN Courier testifies to this. Improved observational means and an increased understanding of particle phenomena in the early universe have transformed cosmology from a speculative philosophy into an exact science, yielding diversified knowledge about the laws of nature.
In 1997 particle physicists interested in cosmology recognized the need for a regular workshop dedicated to particle astrophysics and particle cosmology. Particle astrophysics includes, in broad terms, studies of and searches for relic particles and other remnants from the early universe constituting dark matter, as well as neutrino astrophysics, which gives important information about neutrino properties. Particle cosmology deals with particle aspects of the physics of the very early universe: inflation, reheating, cosmological aspects of Grand Unified Theories and strings, baryogenesis, phase transitions and other aspects of symmetry breaking. To complete the picture, particle physicists also need to make contact with purely gravitational issues, with the possible need to rewrite general relativity and with purely astronomical work leading to the determination of cosmological parameters.
The first workshop in the series, COSMO-97, was held in the Lake District, England; COSMO-98 was held in Asilomar, California; COSMO-99 was held in Trieste, Italy; and COSMO-2000 was held in Cheju Island, Korea. This year’s meeting, COSMO-01, was held in Rovaniemi, Finland, right on the Arctic Circle, on 30 August – 4 September. In Finland, research in cosmology started in the early 1980s and today it is one of the most fruitful research fields in the physics department of the University of Helsinki and in the Helsinki Institute of Physics. New impetus and new resources have arrived since Finland joined the European PLANCK project, in which Finnish theoreticians and instrument builders have specific responsibilities.
The Rovaniemi programme comprised 33 invited plenary talks and 63 contributed talks in two parallel sessions. The plenary speakers treated aspects of inflationary cosmology, quintessence cosmology, string cosmology, extra dimensions, the ekpyrotic universe, baryogenesis, Big Bang nucleosynthesis, phase transitions, the angular power spectrum of the cosmic microwave background, large-scale structure, magnetic fields, dark matter, cosmological parameters, neutrino astrophysics and ultrahigh-energy cosmic rays.
The proceedings of the workshop will be published in the SLAC electronic conference proceedings archive and on the Los Alamos arXiv (astro-ph).
The next workshop in this series will be held on 18-21 September 2002 in Chicago.
The collisions produced by the new generation of high-energy hadron machines – Fermilab’s revamped Tevatron for protons on antiprotons, Brookhaven’s RHIC for heavy ions and CERN’s LHC for protons and heavy ions – create (or will create) lots of secondary particles. The study of these high-multiplicity collisions is a central feature of modern particle physics.
More than 90% of the events in high-energy experiments are inelastic, forming a broad background to the more specific processes investigated within the Standard Model. A "new" approach highlights a seldom-investigated region: Very High Multiplicity (VHM) physics.
Although producing many particles, these VHM events (region C in the figure) are rare, making up only about 10⁻⁷ of the total cross-section at the LHC energy. However, their investigation is thought to be extremely important, not only as background for modern experiments, but also to address fundamental questions.
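The rarity of VHM events can be put in perspective with a back-of-envelope rate estimate. The numbers below (an inelastic cross-section of roughly 80 mb and a nominal LHC luminosity of 10^34 cm^-2 s^-1) are illustrative assumptions, not figures from the text:

```python
# Back-of-envelope VHM event-rate estimate with illustrative numbers:
# assume an inelastic cross-section of ~80 mb and a nominal LHC
# luminosity of 1e34 cm^-2 s^-1 (both are assumptions, not article data).
sigma_inel_mb = 80.0
mb_to_cm2 = 1e-27               # 1 millibarn = 1e-27 cm^2
luminosity = 1e34               # cm^-2 s^-1
vhm_fraction = 1e-7             # VHM share of the total cross-section

inelastic_rate = sigma_inel_mb * mb_to_cm2 * luminosity   # events per second
vhm_rate = inelastic_rate * vhm_fraction

print(f"inelastic rate ~ {inelastic_rate:.1e} Hz")   # ~ 8.0e+08 Hz
print(f"VHM rate       ~ {vhm_rate:.1f} Hz")         # ~ 80.0 Hz
```

Even at one part in ten million of the cross-section, such a rate would still yield many VHM events per second at nominal luminosity, so the challenge is triggering and selection rather than raw statistics.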
In principle, all of the collision energy is available to make new particles. However, in practice, most of the secondary particles emerge carrying a lot of kinetic energy, so that less energy is available for the production of additional particles.
What suppresses the production of additional particles?
Confining attention to the small VHM region where many additional particles are created could help to elucidate the question and possibly shed more light on the underlying colour charge confinement phenomenon.
VHM processes may also have played an important role in the evolution of the universe in the immediate aftermath of the Big Bang, because VHM states may be produced only from initial states of very high energy density. The VHM states show remarkable "calmness" and "coldness", and they provide a bridge between kinetic energy and rest mass.
An initial approach is to look at multiple hadron production as a thermal dissipation of incident kinetic energy into produced mass, using Schwinger-Keldysh real-time thermodynamics (Manjavidze and Sissakian 2001). This has analogies with previous work by Bogolyubov.
This shows that the domain where complete thermalization is achieved is precisely the VHM domain. If so, the physics of these rare VHM events may be especially "simple". However, standard approaches are no longer valid for such rare processes. Recent workshops in Dubna, Russia, and in Varna, Bulgaria, focused on these questions.
With so much understanding at stake, it is important that experiments at the LHC, the Tevatron and RHIC turn their attention to this problem. The third workshop on VHM physics will be held in Dubna on 3-6 June 2002.
This year, 5 December marks the centenary of Werner Heisenberg’s birth. It is to him that we owe the first breakthrough of modern atomic theory – the invention of quantum mechanics. His famous uncertainty relations were a central part of its interpretation. He also established several fundamental quantum mechanics applications and pioneered the extension of the theory to high-energy phenomena.
Werner Heisenberg, born in Würzburg, came from an academic family and after 1910 grew up in Munich, where he graduated with distinction from high school in 1920. He studied under Arnold Sommerfeld at the University of Munich, obtaining his PhD in July 1923 and then went on to work under Max Born in Göttingen. In 1924 Niels Bohr invited him to Copenhagen. Thus he became a member of the great international post-First World War community of quantum and atomic theorists, including such brilliant talents as Paul Dirac, Enrico Fermi, Friedrich Hund, Pascual Jordan, Oskar Klein, Hendrik Kramers, Wolfgang Pauli and Gregor Wentzel.
In the very first semester Sommerfeld gave Heisenberg the difficult problem of explaining the anomalous Zeeman effect of sodium spectral lines. The freshman found a perfect solution – exhibiting, however, unusual half-integral quantum numbers and a strangely behaving atomic core. Simultaneously he studied the classical hydrodynamical turbulence problem. On the publication of Heisenberg’s first paper in this field in 1922, Sommerfeld remarked to Heisenberg’s father: “You belong to an irreproachable family of philologists, and now you have the misfortune of seeing the sudden appearance of a mathematical-physical genius in your family.” In his PhD thesis, Heisenberg suggested the first method for deriving the critical Reynolds number, marking the transition from laminar to turbulent motion. In spite of this brilliant work, he nearly failed the experimental part of the doctoral exam with Willy Wien.
The breakthrough
In 1923, contemporary atomic theory was in a deep crisis. As a way out, Pauli, who was in Copenhagen, and Born and Heisenberg, who were in Göttingen, proposed replacing the semiclassical differential expressions of Bohr and Sommerfeld by corresponding discrete difference terms to predict experimental quantum results (the 1925 Kramers-Heisenberg formula, which predicted the Raman effect, for example). Heisenberg and Pauli claimed that fundamental concepts of the old theory, notably electron orbits, had to be abandoned completely.
In May 1925, in Göttingen, Heisenberg began to describe atomic systems by observables only ("quantum-theoretical" Fourier series). With this, the usual physical quantities, like position q and momentum p of an electron, did not commute but satisfied instead the relation pq – qp = h/2πi. In June 1925, when Heisenberg was recovering from a severe attack of hay fever on the island of Heligoland, he found that he could satisfy the necessary requirement of energy conservation in atomic processes. His "quantum-theoretical reformulation" was the breakthrough to modern quantum mechanics. Soon Born and Jordan reformulated it as "matrix mechanics" and Paul Dirac as "q-number theory", and applied it successfully, as Heisenberg and Pauli did, to various atomic problems.
It was in 1926 that Erwin Schrödinger created wave mechanics, formally equivalent to matrix mechanics, but working with differential equations and continuous wavefunctions. Schrödinger claimed that nature exhibited no "quantum jumps" at all. Heisenberg, from spring 1926 a lecturer and Bohr’s principal assistant in Copenhagen, contradicted this and in early 1927 derived the central result of the physical interpretation: simultaneous measurements of momentum and position of an atomic particle were limited by the famous uncertainty relation Δp·Δq ~ h. This relation had radical consequences – the classical causality law or, expressed more generally, the possibility of a strict separation of object and subject, ceased to be valid in quantum science.
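The content of the uncertainty relation can be illustrated numerically with a Gaussian wave packet, which saturates the bound Δx·Δp = ħ/2. The Python sketch below is a modern textbook illustration, not part of the historical account; ħ is set to 1 for convenience:

```python
import numpy as np

# Numerical check of the uncertainty relation for a Gaussian wave packet,
# which saturates the bound dx*dp = hbar/2 (illustrative sketch, hbar = 1).
hbar, sigma = 1.0, 1.0
x, step = np.linspace(-20, 20, 40001, retstep=True)
psi = (2 * np.pi * sigma**2) ** -0.25 * np.exp(-x**2 / (4 * sigma**2))

norm = np.sum(psi**2) * step                     # should be ~1
dx = np.sqrt(np.sum(x**2 * psi**2) * step)       # position spread
dpsi = np.gradient(psi, x)                       # d(psi)/dx on the grid
dp = hbar * np.sqrt(np.sum(dpsi**2) * step)      # momentum spread (<p> = 0)

print(f"dx*dp = {dx*dp:.4f}  (hbar/2 = {hbar/2})")
```

For any other wave packet the product comes out larger, never smaller, which is the quantitative statement behind Heisenberg's qualitative argument.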
In the fall of 1927, Heisenberg became professor of theoretical physics at Leipzig. Together with Peter Debye and Friedrich Hund he established a new centre of atomic physics there. His first students, Felix Bloch and Rudolf Peierls, pioneered with him the quantum mechanics of solids (ferromagnetism, metals and semiconductors).
Heisenberg’s main interest, however, was a relativistic extension of quantum mechanics: with Pauli he formulated Lagrangian quantum field theory (1929). They tried to cope with the emerging divergence difficulties, achieving some progress with “renormalization” procedures (Heisenberg 1934; Weisskopf 1934). Originally they were led to expect that quantum mechanics would not apply any more at high energy. However, after the discovery of the neutron in 1932, Heisenberg proposed a quantum-mechanical theory of the atomic nucleus based on new exchange forces.
During the 1930s, nuclear theory progressed enormously, mainly through work in the US and in Japan (notably by Hideki Yukawa with his meson theory) and further at Leipzig (despite the Nazi government depriving Heisenberg of excellent students and collaborators after 1933).
From 1932 Heisenberg also turned his attention to the high-energy phenomena observed in cosmic radiation. He suggested several new ideas, such as “explosive showers”, and in 1938, with his student Hans Euler, he solved the problem of the so-called “hard component” (unstable “mesotrons”). These efforts aimed ultimately at an ambitious goal that he and Pauli had envisaged: a unified quantum field theory, describing all elementary particles and their interactions, without any divergences and allowing all of their properties (such as masses and coupling constants) to be calculated. More than 30 years later they still had not reached their goal.
However, during their labours, Heisenberg and Pauli created many concepts of modern high-energy physics, such as isotopic spin (Heisenberg, 1932), spin-statistics theory (Pauli and Fierz, from 1937 to 1941), and the symmetry breaking caused by a degenerate vacuum (Heisenberg and Pauli 1958). In addition, in 1942 Heisenberg proposed the so-called “S-Matrix theory”, which was widely discussed after the Second World War as a phenomenological approach in quantum electrodynamics and strong-interaction theory. Another noteworthy result was the logarithmically rising total cross-section for particle collisions at higher energies (Heisenberg 1954).
Science, politics and international relations
During the Third Reich (1933-1945), Heisenberg’s life and work was made difficult not only by racism directed against his Jewish teachers, colleagues and students, but also by outright attacks on him and his scientific work. Nazi partisans considered quantum and relativity theories to be “degenerate, Jewish physics”, the defenders of which “had to disappear like the Jews”. In spite of these attacks, and in spite of generous offers to accept prestigious chairs in the US, Heisenberg remained in Germany, believing that he did not have the moral right to abandon his students and his country during such difficult times.
During the Second World War he was drafted into the secret German atomic energy project, working on a nuclear reactor, but not on a bomb. In 1942 he moved to Berlin to take over the directorship of the Kaiser Wilhelm-Institut für Physik (which eventually became the Max Planck Institute).
After the war he successfully helped to renew science in the Federal Republic of Germany and to re-establish international scientific relations, assisted by many friends in Europe and beyond. Thus he became a co-founder and ardent supporter of CERN (and the first chairman of its scientific policy committee). He considered international co-operation, especially in the most fundamental fields of science (such as high-energy physics), to be a “main tool to reach understanding between peoples”. As president of the Alexander von Humboldt Foundation, he invited hundreds of young research scholars from all around the world to work at German universities and scientific institutes, and high-energy physics received a substantial share of these fellowships.
Werner Heisenberg died on 1 February 1976 in Munich. To commemorate his 80th anniversary, the Max Planck Institute for Physics (which he had transferred in 1958 from Göttingen to Munich) was given the additional name “Werner-Heisenberg-Institut”.
The centenary is being marked by several special events. From 26-30 September a meeting with the title “100 years of Werner Heisenberg” was held by the Alexander von Humboldt Foundation at Bamberg; from 4-7 December a Heisenberg centennial event at the Max Planck Institute and Ludwig-Maximilians University, Munich, includes a two-day symposium with nine distinguished speakers from abroad; and from 3 December to January 2002 there is a Heisenberg exhibition at the University of Leipzig and at the Max-Planck-Haus, Munich.
One hundred years ago, some physicists began to suspect that electromagnetic radiation was packaged – or "quantized" – rather than being a continuous stream. This followed Max Planck’s discovery that the spectrum of light from a hot object could be explained only if the radiators sat in discrete energy states. By 1905 Albert Einstein concluded that the radiation itself was emitted as bursts of energy – light quanta – later to be called photons. Einstein’s key explanation earned him the 1921 Nobel Prize for Physics.
In 1924 Satyendra Nath Bose from Dacca University, in what was then India, wrote to Einstein asking for his help in getting a paper published. Bose had already sent it to the Philosophical Magazine, where it had been turned down. The paper showed how Planck’s distribution law for photons could be derived from first principles. Duly impressed, Einstein translated it into German, and the paper was published in 1924 in Zeitschrift für Physik.
As a result, Einstein temporarily turned away from his dogged but unsuccessful search for a unified theory of gravitation and electromagnetism and started work on the quantum theory of radiation. Thus was born the concept of “Bose-Einstein” statistics for quanta (“bosons”) carrying an integer value of intrinsic angular momentum (spin). There is no limit to the number of bosons that can simultaneously occupy any one quantum state.
Einstein noted that if the number of such particles is conserved, even totally non-interacting particles should undergo a change of behaviour at low enough temperatures – Bose-Einstein condensation. Bose had not predicted this because he was looking at photons, which can simply disappear when the energy of the system is decreased.
The condensation that Einstein predicted derives from the fact that the number of states available at very low energy becomes exceedingly small. With less and less room for all of the particles when the temperature is decreased, they accumulate (condense) in the lowest possible (ground) energy state.
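The ideal-gas version of this argument yields a closed formula for how the condensate grows as the gas is cooled. The sketch below evaluates the standard textbook result for a uniform, non-interacting Bose gas (an illustration assumed here, not something derived in the article):

```python
# Condensate fraction of a uniform, non-interacting Bose gas below its
# critical temperature Tc (standard textbook result, stated as an
# illustrative assumption):  N0/N = 1 - (T/Tc)**1.5 for T <= Tc.
def condensate_fraction(t_over_tc):
    if t_over_tc >= 1.0:
        return 0.0          # no macroscopic ground-state occupation above Tc
    return 1.0 - t_over_tc ** 1.5

for t in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"T/Tc = {t:.2f}  ->  N0/N = {condensate_fraction(t):.3f}")
```

The fraction switches on sharply at Tc and reaches unity only at absolute zero; in real liquid helium, as noted below, strong interactions suppress it to a few per cent.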
Superbehaviour and its effects
Well before these developments, the liquefaction of helium by cryogenic pioneer H Kamerlingh Onnes (1913 Nobel Prize for Physics) had opened up new areas of physics. Materials cooled by liquid helium to within a few degrees of absolute zero showed bizarre behaviour. However, it took time for the real nature of these effects, now known as superconductivity and superfluidity, to become clear. Superconductivity is the virtual disappearance of electrical resistance at liquid-helium temperatures, and superfluidity is the virtual disappearance of viscosity as we know it. Superfluid helium flows without resistance, as if no internal frictional forces act in the liquid. (Appropriately, these properties are being exploited to the full in the cryogenics for CERN’s new LHC collider, the superconducting magnets of which will be cooled by superfluid helium at 1.9 K.)
In 1938 Fritz London suggested that superfluidity could be caused by the bosonic condensation of helium-4 atoms, which have integer spin. This was supported by the fact that no similar effect had then been seen with the rarer helium-3 isotope, the atoms of which do not have integral spin (see below).
In the 1950s, O Penrose and L Onsager related superfluidity to the long-range order displayed by a highly correlated bosonic system. This gave an estimate of the fraction of condensed atoms in the liquid – only about 8%, because strong interactions in liquid helium make it deviate significantly from the ideal non-interacting gas.
This resistance-free flow was explained by a phenomenological theory devised by L D Landau in 1941, eventually earning him the 1962 Nobel Prize for Physics. In this theory, superfluidity derives from the fact that when the available energy is low enough, only long-wavelength phonons (the vibration quanta of the medium) can be excited.
Electron pairs
Although superconductivity was first seen in 1911, reaching a full theoretical explanation took nearly 50 years. In 1957 J Bardeen, L N Cooper and J R Schrieffer (“BCS”) proposed a theory based on phonon-mediated interactions between the electrons of the metal. This earned them the 1972 Nobel Prize for Physics.
The BCS theory showed that superconductivity is due to strong correlations between electrons of opposite spin. This creates a highly coherent state that is insensitive to perturbations, hence the lack of electrical resistance. Electron pairs can be considered as bosonic particles, and the superconductivity transition is similar to Bose-Einstein condensation.
Earlier, V L Ginzburg and Landau had suggested a phenomenological theory. Although the implications of this approach emerged only slowly, it did lead to new developments in spontaneous symmetry breaking, which turned out to be crucial for particle physics in what is now known as the "Higgs mechanism".
Helium-3 atoms, which have half-integer spin, are not bosons and should not condense like helium-4 to become superfluid. However, in the same way that electron pairs make materials superconducting at low temperatures, the BCS mechanism also opens up the possibility of superfluid helium-3. The discovery of superfluid helium-3 earned the 1996 Nobel Prize for Physics for David Lee, Douglas Osheroff and Robert Richardson.
Pairing effects, this time between nucleons rather than electrons, can also play a role in nuclear physics.
The ultimate candidates for Bose-Einstein condensation were atoms. However, the experimental challenges were formidable and had to await the development of suitable trapping and cooling techniques to confine and groom the atomic states.
In 1995, some 70 years after Einstein’s original prediction, this year’s Nobel laureates succeeded in achieving this extreme state of matter. Eric Cornell and Carl Wieman produced a pure condensate of about 2000 rubidium atoms at 20 nK. Independently, Wolfgang Ketterle performed experiments with sodium atoms. His condensates contained more atoms and could therefore be used to investigate the phenomenon further. Making two separate condensates merge into one another, he obtained very clear interference patterns, showing that the condensate contained coherent atoms. Ketterle also produced a stream of small drops that fell under the action of gravity – a primitive "laser beam" using matter rather than light.
To achieve the very low temperatures needed for Bose-Einstein condensation, physicists have to exploit laser cooling, in which atoms lose energy via the continued absorption and emission of photons of radiation. Steven Chu, Claude Cohen-Tannoudji and William Phillips were awarded the 1997 Nobel Prize for Physics for their development of these techniques.
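Laser cooling on its own has a well known floor, the Doppler limit T_D = ħΓ/(2k_B). The sketch below evaluates it for rubidium, assuming an approximate D2-line natural linewidth of Γ/2π ≈ 6 MHz (an illustrative value, not taken from the article):

```python
import math

# Doppler cooling limit T_D = hbar * Gamma / (2 * k_B), the standard lower
# bound for simple laser cooling. The rubidium D2 linewidth used here
# (Gamma/2pi ~ 6 MHz) is an approximate, illustrative value.
hbar = 1.054571817e-34           # J s
k_B = 1.380649e-23               # J/K
gamma = 2 * math.pi * 6.07e6     # natural linewidth, rad/s (assumed)

T_doppler = hbar * gamma / (2 * k_B)
print(f"Doppler limit for Rb: {T_doppler * 1e6:.0f} microkelvin")  # ~146
```

The resulting temperature of order 100 µK is still many orders of magnitude above the 20 nK of the condensates described above, which is why laser cooling is followed by evaporative cooling in these experiments.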
Since these pioneer experiments, Bose-Einstein condensation has been achieved in a variety of chemical elements. One of the latest developments is a Bose-Einstein condensate on a microelectronic chip. These achievements are a tribute to the ingenuity and perseverance of experimenters, and they demonstrate the subtle interplay of many new scientific techniques.
Satyendra Nath Bose was born in Calcutta, the son of a railway worker. An outstanding physics student, he also had a talent for languages and translated milestone physics material from French and German into English for local publication. One of his efforts was a text by Einstein on General Relativity, the English-language rights for which had meanwhile been acquired by a London publisher. At Bose’s request, Einstein himself intervened and allowed the Bose translation to be used inside India.
It was this episode that gave Bose, working in Dacca, the confidence to approach Einstein again in 1924 with the new derivation of the Planck radiation law: “Respected Sir, I have ventured to send you the accompanying article for your perusal and opinion.” Einstein was impressed: “The Indian Bose has given a beautiful derivation of Planck’s law.” Soon physics history was made. Bose and Einstein met in Berlin in 1925. Bose returned to Dacca and in 1945 moved to Calcutta, where he spent the remainder of his career.
His name is now enshrined in physics. A “boson” is a particle of integer spin that obeys Bose-Einstein statistics and is the counterpart of a “fermion”, which is a particle of half-integer spin that obeys Fermi-Dirac statistics. Unlike Dirac, Einstein and Fermi, Bose did not achieve a Nobel prize. However, in 1930 Venkata Raman, also of Calcutta, earned the Nobel Prize for Physics for the light-scattering effect that bears his name. He was the first scientist from outside Europe and the US to earn the coveted award.
The Nobel century
In 1901 the first Nobel prize award ceremony was held at what is now called the Old Royal Academy of Music in Stockholm. In Christiania (now Oslo), the names of the Nobel laureates were announced in the Storting (the Norwegian Parliament – in 1905 Norway reverted to being a separate monarchy).
The winners of the first Nobel prizes were: physics – Wilhelm Röntgen for his discovery of X-rays; chemistry – J H van’t Hoff for his work on chemical dynamics; medicine and physiology – Emil Adolf von Behring for his work on serum therapy, especially its application against diphtheria; literature – Sully Prudhomme (René François Armand Prudhomme); peace – Jean Henri Dunant, founder of the Red Cross in Geneva, and Frédéric Passy, founder of the first French peace society.
To commemorate the centennial of the first Nobel awards, all of the living laureates have been invited to participate in a Centennial Week in December. Beginning with lectures at various universities around Sweden and Norway, the week culminates with the Nobel festivities on 10 December in Stockholm and Oslo. Several hundred laureates are expected to participate in the event.
It all began 1.8 billion years ago, when, geologists believe, a meteorite struck the Earth, creating what is now the Sudbury basin in Canada. The impact allowed a rich seam of nickel-copper ore to rise through the Earth’s crust around the rim of the crater. Today the Sudbury basin is circled by the world’s largest concentration of nickel mines, and in one of them scientists accompany the miners on their morning descent to the 6800 ft (2000 m) level.
The Sudbury landscape still has an unearthly quality about it. Early mining efforts stripped away trees to provide fuel for smelting the ore, with the result that in the 1960s the Sudbury basin resembled a moonscape. NASA even sent moonshot astronauts there for training. Today the trees are coming back, thanks in part to the mines themselves, where underground nurseries provide warm stable conditions for trees to grow. “All you have to add is light,” said Art McDonald, director of the Sudbury Neutrino Observatory (SNO) as we stepped off the lift 2000 m underground. Here the rock is constantly at a temperature of 40°C, making for a sticky 1.5 km walk along the “SNO drift” – the tunnel connecting the mine shaft to SNO’s underground laboratory.
Cleanliness is the key
Visiting SNO is an adventure in itself. Scientists and miners are indistinguishable in all but their conversation as they descend in the lift. Overalls, miners’ lamps and safety harnesses are the order of the day. Everything to be taken into the lab must be carefully wrapped in plastic to protect it from the omnipresent mine dust. Arriving at the lab, boots are rinsed down, clothes are removed and everyone takes a shower before changing into a clean set of overalls and entering the lab.
Scrupulous attention to cleanliness is one of the keys to SNO’s success. Incredibly, the laboratory maintains class-100 clean-room conditions in the most sensitive areas and all areas are class-3000 or better. That means that everywhere within the laboratory there are fewer than 3000 particles of 1 µm or larger per cubic metre of air. A typical room would give a count of around 100 000 particles, and the SNO drift considerably more. Even more impressive is that these clean conditions were maintained throughout the construction of the experiment.
The emphasis on purity does not end with the air. Systems for purifying the SNO detector’s light and heavy water fill most of the available space. The 33 m deep, 22 m diameter chamber that houses the detector is lined with several layers of plastic material that help to keep the radiation level from uranium and thorium a full nine orders of magnitude lower than in the surrounding rock.
SNO began collecting data in 1999, but its history goes back much further. In 1984 Herb Chen of the University of California at Irvine first pointed out the advantages of using heavy water as a detector for solar neutrinos. Two reactions – one sensitive only to electron-type neutrinos, the other sensitive to all neutrino flavours – would allow such a detector to measure neutrino oscillations directly. The Creighton mine in Sudbury – among the deepest in the world – was quickly identified as an ideal place for Chen’s proposed experiment to be built and the SNO collaboration held its first meeting in 1984.
There were substantial obstacles to overcome before the experiment could be realized, not least of which was the cost of the heavy water. It was clear from the start that industrial partners would have to be found. INCO, the company operating the Creighton mine, became a key player, putting its infrastructure at SNO’s disposal and blasting out a new cavern for the experiment far away from ongoing mine activity. Another key partner was found in the form of Atomic Energy of Canada Limited, which provided C$330 million of heavy water on loan, free of charge. “In a sense we’re doing a greater than C$600 million project for less than C$100 million in terms of capital cost,” explained McDonald.
The experiment was approved in 1990. Excavation took three years and installation a further five. The detector consists of a 12 m diameter acrylic sphere containing 1000 tonnes of heavy water surrounded by light water and viewed by 10 000 photomultipliers. Filling the sphere with heavy water, flooding the cavern with light water and calibrating the detector was complete by November 1999, allowing data taking to begin.
In its first phase of running – to June 2001 – SNO’s analysis concentrated on the measurement of boron-8 electron neutrinos from the Sun. These are detected at SNO via the charged-current process, in which an electron neutrino interacts with a deuteron to produce two protons and an electron. First results, published in 2001, provide compelling evidence for neutrino oscillation when taken together with Superkamiokande’s earlier measurement via the elastic scattering of boron-8 neutrinos from electrons, a process with low sensitivity to neutrino types other than electron neutrinos (see "The Sudbury Neutrino Observatory confirms the oscillation picture").
The next step for SNO was to measure the total boron-8 neutrino flux to give a complete measurement that is independent of the Superkamiokande result. To do this, salt has been added to the heavy water. Salt increases SNO’s sensitivity to the flavour-blind process of neutral current neutrino-deuteron interactions, which are identified by the detection of the photon emitted when the deuteron’s neutron is captured. Capture on heavy water results in a 6.25 MeV photon, whereas capture on chlorine releases an 8.6 MeV photon that is more easily detected. Moreover, the neutron capture probability in SNO’s heavy water is around 25%, whereas in salt it rises to 85%. Radioactivity levels are also low for this phase of the experiment and data analysis is under way.
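Using only the capture probabilities quoted above, a one-line estimate shows the gain the salt brings to the neutral-current channel (the easier detection of the higher-energy photon is neglected in this rough sketch):

```python
# Rough gain in neutral-current sensitivity from adding salt, using the
# neutron-capture probabilities quoted in the text. Detection-efficiency
# differences from the photon energies (6.25 vs 8.6 MeV) are neglected.
p_capture_d2o = 0.25    # capture probability in pure heavy water
p_capture_salt = 0.85   # capture probability with salt added (on Cl)

gain = p_capture_salt / p_capture_d2o
print(f"capture-probability gain with salt: {gain:.1f}x")   # 3.4x
```

Combined with the more easily detected 8.6 MeV photon, this is why the salt phase gives SNO a flux measurement independent of the elastic-scattering channel.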
In a third phase of running, scheduled to begin in the second half of 2002, the salt will be removed and replaced by helium-3-filled proportional counters. These will give the experiment an independent sensitivity to the neutral current process and allow distortions in the solar boron-8 spectrum to be measured more accurately than before.
Supernova warning
Solar neutrinos form just one strand of SNO’s research programme. The experiment’s ability to single out electron-neutrino interactions and its high sensitivity to other neutrino types give it a powerful tool for investigating supernovae, by observing how the different neutrino types emerging from the explosion develop in time. SNO’s data-acquisition system, normally running at around 10 Hz, is set up to buffer several hundred events in a window lasting just a few seconds if necessary, and it also alerts the shift crew whenever the event rate rises significantly. This initiates an analysis procedure designed to identify whether noise or physics is responsible for the rise. SNO will be part of the Supernova Early Warning System (SNEWS), along with the LVD (Gran Sasso), Superkamiokande (Japan) and AMANDA (South Pole) experiments. Signals sent to a central computer in Japan can be studied for time coincidences, and the astronomical community can be alerted in the case of a supernova.
The neutrino burst can precede light by several hours. The detector’s location 2000 m below a flat surface also makes it a particularly powerful instrument for observing neutrinos created via cosmic-ray interactions in the atmosphere. In contrast with detectors under mountains, SNO has a 45° window for measuring downward-moving neutrinos. A clear distinction between downward and upward-moving neutrinos will allow SNO to make a model-independent measurement of atmospheric neutrinos over a three- to four-year timescale. SNO has a well defined programme until 2006 and ambitious plans thereafter. The scientists envisage a shift in emphasis towards more subtle neutrino physics and possible improvements to the SNO detector. Seasonal variations and correlations with the solar cycle are on the agenda. SNO will also turn its attention to other neutrino oscillation processes in the Sun.
Canadian scientists are hopeful of extending the laboratory beyond the one experiment that it currently houses. The Canadian government has recently launched the Canadian Foundation for Innovation International Programme to generate world-class international research facilities in Canada, and Sudbury is a strong contender. Having passed the first round of selection, the laboratory has been invited to submit a detailed proposal by February. Under this C$30 million plan, the Sudbury site would acquire a new experimental hall to house at least two new experiments. Final selection is scheduled for June 2002.
Herb Chen didn’t live to see his brainchild realized. He died in 1987, but his presence at Sudbury is still very strongly felt. Copies of his 1984 Physical Review Letters paper hang proudly around the laboratory and his portrait graces the entrance. SNO has put Sudbury firmly on the physics map, but it hasn’t lost sight of its roots. “The SNO team is working very hard to accomplish the full physics objectives while maintaining Herb’s memory as a constant inspiration,” explained McDonald.
High-energy and nuclear physicists, astronomers, astrophysics specialists, astroparticle physicists and ministry representatives all provided input for a recent workshop on astroparticle physics sponsored by the German Ministry for Education and Research (BMBF). Such cross-disciplinary input sparked many interdisciplinary discussions and was immediately fruitful.
After initial special lectures for students, the science talks proper reviewed the cosmic background radiation at all wavelengths, presented by D Lemke from the Max-Planck Institute (MPI) for Astronomy in Heidelberg; the evidence for dark matter in the universe (P Schneider, Bonn); properties of black holes (H Falcke, MPI for Radioastronomy, Bonn); the connection between astroparticle physics and particle physics (A Ringwald, DESY); and cosmology with new experiments (F Aharonian, MPI for Nuclear Physics, Heidelberg, and G Sigl, Institut d’Astrophysique de Paris).
Other talks focused on currently running or planned experiments. Now widely exploited is the technique of imaging atmospheric Cherenkov telescopes (IACTs), in which the primary cosmic photon is detected via the Cherenkov light emitted by the photon-induced particle shower in the Earth’s atmosphere.
In the past 10 years such experiments have progressed from prototypes to precision tools. The results are puzzling. IACTs were originally aimed at finding where galactic charged cosmic rays are accelerated, but to date no convincing acceleration sites have been pinned down. However, totally unexpected, highly variable, extragalactic sources have been discovered that emit photons with energies of up to at least 16 TeV (G Heinzelmann, Hamburg).
The detailed energy spectra obtained are difficult to understand and may conflict with our present understanding of intergalactic background radiation fields. New experiments (W Hofmann, MPI for Nuclear Physics, Heidelberg, and E Lorenz, MPI for Physics, Munich) with significantly improved sensitivities will start data taking in 2002 and are bound to find more surprises. Their energy threshold will be lowered to match the upper end of the energy range accessible by satellites (G Kanbach, MPI for Extraterrestrial Physics, Garching) so that the last energy gap for observational astrophysics with high-energy photons will be closed.
While the IACTs are fighting the air showers induced by charged cosmic rays as unwanted background, other experiments concentrate on these showers to understand the origin of cosmic rays. The ambitious aim is to collect detailed data on the energy spectrum and the mass composition. The most interesting of the energy regions are the “knee”, around 10¹⁶ eV, where the index of the energy spectrum suddenly changes, and the highest energies, above 10¹⁹ eV.
Because of the very low cosmic-ray flux at these energies, direct measurements by balloons or satellites (M Simon, Siegen) are not possible. The only experimental approach realized so far uses extended ground-based installations that record the induced air showers. Physicists encounter several challenges. One results from the absence of accelerator-based measurements of particle interactions in the relevant kinematic regions. Not only must the astrophysical questions be answered, but reliable interaction models to simulate the air-shower development must also be worked out in parallel.
K-H Kampert (Karlsruhe) presented the progress around the “knee”, where detailed multiparameter analyses may soon allow firm conclusions to be drawn. At the highest energies, new, extended experiments will provide large event numbers (H Blümer, Research Centre Karlsruhe) and tackle the mystery of the cosmic rays beyond the Greisen-Zatsepin-Kuzmin cut-off energy near 10²⁰ eV.
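The steeply falling spectrum with its change of slope at the knee can be illustrated with a simple broken power law. The spectral indices used below (roughly 2.7 below the knee and 3.1 above it) are the commonly quoted values; the normalization is arbitrary and purely illustrative.

```python
# Illustrative broken power law for the cosmic-ray flux. The indices
# (~2.7 below the knee, ~3.1 above) are the commonly quoted values;
# the normalization is arbitrary, chosen only to show the scaling.

E_KNEE = 1e16  # eV, approximate position of the "knee"

def flux(E, norm=1.0, gamma_low=2.7, gamma_high=3.1):
    """Differential flux dN/dE (arbitrary units), continuous at the knee."""
    if E <= E_KNEE:
        return norm * (E / E_KNEE) ** (-gamma_low)
    return norm * (E / E_KNEE) ** (-gamma_high)

# Each decade in energy above the knee suppresses the flux by a factor
# 10**3.1, versus 10**2.7 for a decade below it.
print(flux(1e15) / flux(1e16))  # ≈ 10**2.7 ≈ 501
print(flux(1e16) / flux(1e17))  # ≈ 10**3.1 ≈ 1259
```

This simple scaling is why direct balloon or satellite measurements run out of statistics near the knee and ground arrays with huge collection areas take over.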
Our everyday world is made up of atoms in which protons and neutrons are packed together by the strong nuclear force into a tiny core of positive electric charge, surrounded by a sparse cloud of “orbiting” electrons, which carry an equal and opposite negative electric charge.
The electron orbits are hundreds or thousands of times larger than the nucleus. Seen from an orbiting electron, the central nucleus looks far away and rather structureless, in the same way that the Sun appears to us as a distant homogeneous sphere.
By making artificial atoms in which electrons are replaced by other, heavier particles that pass very close to the nucleus, physicists are able to get a close look at the centre of the atom in the same way that space probes, such as the SOHO satellite, see a very different picture when approaching the Sun’s surface.
Being electrically charged, protons can be mapped by probing the nucleus with charged particle beams, like electrons. Neutrons are more difficult to map, especially at the outer, less dense edges of the nucleus. Over the years, experiments using a variety of techniques have suggested the existence of a neutron “halo” – a sort of uniform nuclear stratosphere, relatively isolated from the “weather” at the centre of the nucleus. An experiment at CERN has given fresh support to this idea.
Many of the particles commonly produced by high-energy beams can carry negative charge. Examples are the muon, the pion, the kaon, hyperons and the antiproton. Normally these particles travel so fast that they tear past target atoms, ripping out electrons in their wake. However, as the particles lose energy and slow down, they can eventually reach a point where they knock out an electron for the last time and become captured by the electric field of the neighbouring nucleus, thus forming an “exotic” atom.
In such an atom the intruder orbital particle is much heavier than the electron it has replaced, so its orbit is smaller. A muon, for example, is about 200 times as heavy as an electron and is able to pass correspondingly closer to the nucleus. However, a muon, like an electron, does not feel the strong nuclear force, even at very close distances.
Strongly interacting particles such as the pion, the kaon, hyperons and the antiproton do feel the strong force of the nucleus. In addition, they are heavier still (an antiproton being nearly 2000 times heavier than an electron), so they can get very close to an atomic nucleus. Exotic atoms are therefore a good laboratory for studying the periphery of the nucleus.
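The reason heavier orbiting particles probe the nucleus so much better follows from simple Bohr-model scaling: the orbit radius is inversely proportional to the mass of the orbiting particle. The sketch below, using approximate mass ratios and ignoring reduced-mass and relativistic corrections, shows how dramatically the orbits shrink.

```python
# Bohr-model scaling: for a hydrogen-like atom the orbit radius goes as
# a = a0 * (m_e / m), where a0 is the electron's Bohr radius and m the
# mass of the orbiting particle. Reduced-mass and relativistic
# corrections are ignored in this rough sketch; the mass ratios are
# approximate textbook values.

A0 = 5.29e-11  # m, Bohr radius of an ordinary hydrogen atom

MASS_RATIO = {      # particle mass / electron mass (approximate)
    "electron": 1.0,
    "muon": 207.0,
    "pion": 273.0,
    "kaon": 966.0,
    "antiproton": 1836.0,
}

def orbit_radius(particle):
    """Innermost orbit radius (metres) in the simple Bohr-scaling picture."""
    return A0 / MASS_RATIO[particle]

for name, ratio in MASS_RATIO.items():
    print(f"{name:10s} m/m_e = {ratio:6.0f}  a = {orbit_radius(name):.2e} m")
```

An antiproton's orbit is thus nearly 2000 times smaller than an electron's, placing it well inside the electron cloud and close to the nuclear surface.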
In 1988, CERN’s European Muon Collaboration (EMC) experiment stunned the world of physics with the announcement that some of the nucleon’s spin was missing. More than a decade later, physicists are still trying to account for the missing spin and the spotlight has moved to the HERMES experiment at Hamburg’s DESY laboratory.
A close look inside the nucleon
The closer you look, the more you see. That may sound like a common sense maxim, but when it comes to nucleons it takes on an interesting twist. In a simple quark-parton model, a nucleon is made up of three quarks and it is the spins of these quarks that are supposed to give the nucleon its spin. That simple notion was disproved when the EMC experiment gave rise to what was quickly dubbed the nucleon spin crisis, announcing that quarks could account for only around 20% of a nucleon’s spin at most.
The EMC result led to several experiments taking a much closer look at what goes on inside nucleons. The three quarks that give a nucleon its identity – the valence quarks – are just the beginning of the story. They swim in a “sea” of virtual quarks and antiquarks that are constantly popping in and out of the vacuum. Moreover, gluons flit about inside nucleons, holding the quarks together. All of these can contribute to a nucleon’s spin, and their constant movement generates an intrinsic angular momentum of the nucleon as a whole.
A succession of experiments to pin down the effect ensued at CERN (the NMC and SMC collaborations using muon beams) and at SLAC (with electron beams), and by the late 1990s the spin crisis had transformed into a puzzle: that of finding out how the nucleon’s spin was distributed among its various contributing factors. The CERN and SLAC experiments had confirmed, with greatly improved precision, the original EMC finding that quarks alone could not be responsible for the nucleon’s spin. The next task was to measure the contributions of the individual quark flavours and of gluons. The baton passed to DESY, whose HERA measurement of spin (HERMES) experiment had started to collect data in 1995.
HERMES uses the polarized (spin-oriented) positron or electron beam of DESY’s HERA collider incident on a polarized gas-jet target. This, coupled with the HERMES detector’s powerful particle identification capability, has allowed the collaboration to measure precisely the contributions to nucleon spin of each valence quark flavour. HERMES’ results are in perfect agreement with earlier results and show that the “up” valence quarks spin the same way as the nucleon as a whole, while the “down” valence quarks spin the opposite way.
HERMES has also published a first estimate of the gluon contribution to a nucleon’s spin. HERA’s lepton probes do not see electrically neutral gluons directly, so this is a difficult measurement to make. HERMES has achieved it by exploiting the process of photon-gluon fusion. When an electron scatters from a quark in a nucleon, it does so through the exchange of a virtual photon, and this photon can interact with gluons as they dissociate into quark-antiquark pairs.
In HERMES’ first phase of running, from 1995 to 2000, the experiment showed that gluons do indeed contribute to a nucleon’s spin, spinning in the same direction as the nucleon. Future analyses will use the HERMES detector’s particle identification capabilities to quantify the gluon contribution by studying processes involving charm quarks, which give a precise handle on gluons.
An experiment at Brookhaven’s Alternating Gradient Synchrotron (AGS) has “mass-produced” doubly strange hypernuclei – exotic nuclei in which two neutrons have been replaced by Λ hyperons (each containing three quarks – up, down and strange).
In the AGS experiment by a collaboration of physicists from Canada, Germany, Japan, Korea, Russia and the US, a beam of negative kaons (negative charge, strangeness minus 1) produced outgoing positive kaons (positive charge, strangeness plus 1).
In these reactions an incoming negative kaon hits a nuclear proton, transforming it into a negative Ξ particle (strangeness minus 2 – consisting of a down quark and two strange quarks). This in turn transforms into a pair of Λ particles, each of which can lodge in the parent or a nearby nucleus, and sometimes even in the same nucleus, identified as double-Λ hydrogen-4, containing a proton, a neutron and two Λs. The doubly strange hypernucleus thus resembles a deuteron onto which two Λs have been grafted. The nuclear states are identified via their weak decay patterns into pions, each pion being associated with a unit of strangeness change.
Such doubly strange hypernuclei have been reported before singly, but never in such quantities. Their production opens the door to the study of other such nuclei and of the mutual interaction of Λ particles. Because a Λ is not subject to the Pauli exclusion principle with respect to the nucleons, substituting one for a neutron upsets the usual nuclear shell assignations and allows the strange particle to penetrate to the nuclear core.
The simplest hypernuclei, containing a single strange particle, have been known and studied for almost half a century.
“Returning to my Ann Arbor attempts, I became immediately very eager to see how far the mentioned analogy reached, first trying to find out whether the Maxwell equations for the electromagnetic field, together with Einstein’s gravitational equations, would fit into a formalism of five-dimensional Riemann geometry.”
Oskar Klein, “From my life of physics” in the anthology From a Life of Physics (1989 World Scientific).
The inaugural conference of the Michigan Center for Theoretical Physics (MCTP), entitled 2001: a Spacetime Odyssey, was held in Ann Arbor on 21-25 May. In keeping with the MCTP’s mission to provide a venue for interdisciplinary studies in theoretical physics and related mathematical sciences, the conference brought astronomers, cosmologists, particle physicists and mathematicians together to share their different perspectives on space-time at the beginning of the 21st century.
Two theories revolutionized the 20th-century view of space and time: quantum mechanics and Einstein’s General Theory of Relativity. The union of these theories provides a context for developments such as elementary particle theories with extra space-time dimensions, the inflationary model of Big Bang cosmology, dark matter and dark energy in the universe, radiation from quantum black holes, and the fuzzy space-time geometry of superstrings and M-theory. These developments, which are derived in part from the 19th-century mathematics of Riemannian geometry and Lie groups, have in their turn inspired new directions in the pure mathematics of topology and knot theory. All of these different strands of the space-time story were represented at the conference.
The dawn of space-time
Ann Arbor was a particularly appropriate place for such a celebration because it was here that Oskar Klein first came up with the idea of extra space-time dimensions and of using Riemannian geometry to explain not only gravity but also electromagnetism.
The conference began with “My life as a boson”, a historical recollection by Peter Higgs (Edinburgh) on the discovery of the particle that bears his name. Higgs described the history leading up to his 1964 paper, the controversy over the Goldstone theorem, and some interesting reactions to his paper. Only after Physics Letters had rejected his original version did Higgs produce a revised text predicting the existence of the new particle, which was then accepted by Physical Review Letters. He ended his talk with the hopeful mention of “indications of H at about 115 GeV” at CERN’s LEP electron-positron collider.
Joseph Silk (Oxford), Robert Kirshner (Harvard-Smithsonian Center for Astrophysics), Alan Guth (MIT), Paul Steinhardt (Princeton), Andre Linde (Stanford), Wendy Freedman (Carnegie Observatories) and Michael Turner (Chicago) each offered different perspectives on the universe. While new data pinpoint cosmological parameters with more and more accuracy, and evidence for the current acceleration of the expansion of the universe and the mysterious “dark energy” now seems convincing, controversy over their origins continued to swirl with Steinhardt’s presentation of the novel “ekpyrotic” alternative to inflation. Inspired by the Horava-Witten picture of an 11-dimensional M-theory universe sandwiched between two 10-dimensional boundaries, this scenario ascribes the Hot Big Bang to a smashing together of two three-dimensional surfaces (three-branes) in a five-dimensional universe: the Big Crash.
Linde championed the old inflation with a vigorous deflation of Steinhardt’s ideas. Freedman then reported a new result – the first measurement of extragalactic background light, which turned out to be about twice that of individual galaxies. The mass associated with starlight was thus measured to be about 1% of critical density. Guth emphasized the evidence for inflation and its robustness.
Shing-Tung Yau (Harvard), Isadore Singer (MIT), Arthur Jaffe (Harvard) and Bruno Zumino (Berkeley) gave us a glimpse into the realm of higher mathematics and its intriguing applications to problems in quantum field theory, superstrings and M-theory. For Singer, this was a sentimental journey, and he recalled his arrival as an undergraduate in physics at the University of Michigan in 1941. Checking into his dormitory, he was told to go upstairs and make his bed. Sure enough, he discovered a hammer, some nails and a pile of two-by-four planks where his bed was supposed to have been.
Quantum gravity was the common theme of talks by James Hartle (UC Santa Barbara), Jacob Bekenstein (Jerusalem) and Stanley Deser (Brandeis). Hartle discussed the reconciliation of Einstein’s general relativity with quantum mechanics, including the many-worlds interpretation, wormholes in space and the possibility of time travel. Bekenstein presented the case for discrete energy levels for quantum black holes as a way of understanding their thermodynamics, which was pioneered by him and Hawking in the 1970s; and Deser gave an overview of gravity’s century, from Einstein to M-theory.
Discoveries overlooked
Particle phenomenology was covered by Lev Okun (ITEP, Moscow), John Bahcall (Princeton IAS), Helen Quinn (SLAC), Mary K Gaillard (Berkeley), Paul Frampton (North Carolina, Chapel Hill) and Martinus Veltman (Michigan). Okun recalled his early days in Moscow and the discoveries made there – such as Khriplovich’s 1969 calculation of asymptotic freedom – and the reaction (or lack thereof) in the West. He also recalled Sakharov’s reaction to the possibility of vacuum bubble creation by colliders: “Such theoretical work should be forbidden.”
Bahcall presented an overview of our understanding of solar neutrinos and described how upcoming data, particularly from SNO, would sharpen it (as indeed happened soon after the conference).
Quinn reviewed the theoretical status of CP violation and current attempts to test it experimentally, while Frampton presented a model of spontaneous CP violation. Gaillard put the case for a derivation of the Standard Model from the compactification of the 10-dimensional heterotic string on a six-dimensional Calabi-Yau manifold. Veltman was broadly critical of current research directions, casting doubt on black holes, the acceleration of the universe and superstring theory, although he was positive about the construction of the superconducting Tesla linear electron collider.
Superstrings
Superstring theory and its successor, M-theory, were covered by John Schwarz (Caltech) and Alexander Polyakov (Princeton). Schwarz reviewed the phenomenon of anomaly cancellation from the Standard Model to the superstring, while Polyakov discussed the Plato’s-cave “holographic” universe, in which our four-dimensional space-time is a boundary of a five-dimensional anti-de Sitter space.
One of the missions of the MCTP is to convey the excitement of physics to the general public. The 2001: A Spacetime Odyssey conference achieved this very successfully with the collaboration of the Physics Department and the School of Art and Design at the University of Michigan. Ten new works by professional artists, with inspiration from physicists, were created for exhibition at the conference.
Most speakers agreed that participating in such a meeting had broadened their horizons, and perhaps even influenced their work. In his after-dinner speech at the conference banquet, Sheldon Glashow (Boston) assessed the current state of particle and astrophysics, and whether society cared enough to sustain it. After a good meal and fine wine, the future looked rosy.