
Those were the days: discovering the gluon


In the mid-1970s quantum chromodynamics (QCD) was generally referred to as the “candidate” theory of the strong interactions. It was known to be asymptotically free and was the only plausible field-theoretical framework for accommodating the (approximate) scaling seen in deep-inelastic scattering, as well as having some qualitative success in fitting the emerging pattern of scaling violations. Moreover, QCD could be used to explain qualitatively the emerging spectrum of charmonia and had some semi-quantitative successes in calculating their decays. No theorist seriously doubted the existence of the gluon but direct proof of its existence, a “smoking gluon”, remained elusive.

In parallel, jet physics was an emerging topic. Statistical evidence was found for two-jet events in low-energy electron–positron annihilation into hadrons at SPEAR at SLAC, but large transverse-momentum jets had not yet been observed at the Intersecting Storage Rings, CERN’s pioneering proton–proton collider. There, it was known that the transverse-momentum spectrum of individual hadron production had a tail above the exponential fall-off seen in earlier experiments, but the shape of the spectrum did not agree with naive predictions that were based on the hard scattering of quarks and gluons, so rival theories – such as the constituent-interchange model – were touted.

The three-jet idea

This was the context in 1976 when I was walking back over the bridge from the CERN cafeteria to my office one day. As I turned the corner by the library, it occurred to me that the simplest experimental situation to search directly for the gluon would be through production via bremsstrahlung in electron–positron annihilation. Two higher-energy collider projects were in preparation at the time, PETRA at DESY and PEP at SLAC, and I thought that they should have sufficient energy to observe clear-cut three-jet events. My theoretical friends Graham Ross, Mary Gaillard and I then proceeded to calculate the gluon bremsstrahlung process in QCD, demonstrating how it would manifest itself via jet broadening and the appearance of three-jet events featuring the long-sought “smoking gluon”. We also contrasted the predictions of QCD with a hypothetical theory based on scalar gluons.
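
For reference, the leading-order QCD distribution that governs gluon bremsstrahlung in e+e– annihilation is, in modern textbook notation (not necessarily that of the 1976 paper),

\[
\frac{1}{\sigma_0}\,\frac{\mathrm{d}\sigma}{\mathrm{d}x_1\,\mathrm{d}x_2} \;=\; \frac{2\alpha_s}{3\pi}\,\frac{x_1^2 + x_2^2}{(1-x_1)(1-x_2)}, \qquad x_i = \frac{2E_i}{\sqrt{s}},
\]

where x1 and x2 are the scaled energies of the quark and antiquark. The distribution diverges only as x1 or x2 → 1, i.e. in the soft and collinear limits, so well-separated three-jet configurations are calculable in perturbation theory.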

I was already in contact with experimentalists at DESY, particularly my friend the late Bjørn Wiik, who shared my enthusiasm about the three-jet idea. Soon after Mary, Graham and I had published our paper, I made a trip to DESY to give a seminar about it. The reception from the DESY theorists of that time was one of scepticism, even hostility, and I faced fierce questioning on why the short-distance structure of QCD should survive the hadronization process. My reply was that hadronization was expected to be a soft process involving small exchanges of momenta and that two-jet events had already been seen at SPEAR. At the suggestion of Bjørn Wiik, I also went to Günter Wolf’s office to present the three-jet idea: he listened much more politely than the theorists.

The second paper on three-jet events was published in 1977 by Tom Degrand, Jack Ng and Henry Tye, who contrasted the QCD prediction with that of the constituent-interchange model. Then, in 1978, George Sterman and Steve Weinberg published an influential paper showing how jet cross-sections could be defined rigorously in QCD with a careful treatment of infrared and collinear singularities. In our 1976 paper we had contented ourselves with showing that these were unimportant in the three-jet kinematic region of interest to us. Sterman and Weinberg opened the way to a systematic study of variables describing jet broadening and multi-jet events, which generated an avalanche of subsequent theoretical papers. In particular, Alvaro De Rújula, Emmanuel Floratos, Mary Gaillard and I wrote a paper showing how “antenna patterns” of gluon radiation could be calculated in QCD and used to extract statistical evidence for gluon radiation, even if individual three-jet events could not be distinguished.

Meanwhile, the PETRA collider was being readied for high-energy data-taking with its four detectors, TASSO, JADE, PLUTO and Mark J. I maintained regular contact with Bjørn Wiik, one of the leaders of the TASSO collaboration, as he came frequently to CERN around that time for various committee meetings. I was working with him to advocate the physics of electron–proton colliders. He told me that Sau Lan Wu had joined the TASSO experiment and that he had proposed that she prepare a three-jet analysis for the collaboration. She and Gus Zobernig wrote a paper describing an algorithm for distinguishing three-jet events, which appeared in early 1979.

Proof at last

During the second half of 1978 and the first half of 1979, the machine crews at DESY were systematically increasing the collision energy of PETRA. The first three-jet news came in June 1979 at the time of a neutrino conference in Bergen. The weekend before that meeting I was staying with Bjørn Wiik at his father’s house beside a fjord, when Sau Lan Wu arrived over the hills bearing printouts of the first three-jet event. Bjørn included the event in his talk at the conference and I also mentioned it in mine. I remember Don Perkins asking me whether one event was enough to prove the existence of the gluon: my tongue-in-cheek response was that it was difficult to believe in eight gluons on the strength of a single event!


The next outing for three-jet events was at the European Physical Society conference in Geneva in July. Three members of the TASSO collaboration, Roger Cashmore, Paul Söding and Günter Wolf, spoke at the meeting and presented several clear three-jet events. The hunt for gluons was looking good!

The public announcement of the gluon discovery came at the Lepton/Photon Symposium held at Fermilab in August 1979. All four PETRA experiments showed evidence: Sam Ting’s Mark J collaboration presented an analysis of antenna patterns; while JADE and PLUTO followed TASSO in presenting evidence for jet broadening and three-jet events. One three-jet event was presented at a press conference and a journalist asked which jet was the gluon. He was told that the smart money was on the left-hand one (or was it the right?). Refereed publications by the four collaborations soon appeared and the gluon finally joined the Pantheon of established particles as the first gauge boson to be discovered after the photon.

An important question remained: was the gluon a vector particle, as predicted by QCD, or was it a scalar boson? In 1978 my friend Inga Karliner and I wrote a paper that proposed a method for distinguishing the two possibilities, based on our intuition about the nature of gluon bremsstrahlung. This was used in 1980 by the TASSO collaboration to prove that the gluon was indeed a vector particle, a result that was confirmed by the other experiments at PETRA in various ways.


Gluon-jet studies have developed into a precision technique for testing QCD. One-loop corrections to three-jet cross-sections were calculated by Keith Ellis, Douglas Ross and Tony Terrano in 1980 and used, particularly by the LEP collaborations, to measure the strong coupling and its running with energy. The latter also used four-jet events to verify the QCD predictions for the three-gluon coupling, a crucial consequence of the non-Abelian nature of QCD.
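
As an illustration of what “running with energy” means, the one-loop QCD expression – a standard textbook formula rather than anything specific to the analyses mentioned here – is

\[
\alpha_s(Q^2) \;=\; \frac{12\pi}{(33 - 2\,n_f)\,\ln(Q^2/\Lambda^2)},
\]

where n_f is the number of active quark flavours and Λ sets the QCD scale; measuring α_s from jet rates at different collision energies tests this predicted logarithmic decrease.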

In the words of Mary Hopkin’s song in 1968, “those were the days, my friends”. A small group of theoretical friends saw how to discover the gluon and promptly shared the idea with some experimental friends, who then seized the opportunity and the rest – as the saying goes – is history. To my mind, it is a textbook example of how theorists and experimentalists, working together, can advance knowledge. The LHC experiments will be a less intimate environment but let us hope that strong interactions between theorists and experimentalists will again lead to discoveries for the textbooks!

Giuseppe Cocconi and his love of the cosmos


In 1938 Giuseppe Cocconi published his first paper, “On the spectrum of cosmic radiation”. His last unpublished note of December 2005 bore the title “Arguments in favour of a personal interpretation of extra galactic cosmic rays”. No better indication could be given of his deep interest in astronomy and astrophysics, which lasted until he died in November 2008 aged 94.

The fields that he pioneered are now witnessing exciting new developments. Over the past six months these developments have reminded us of his many contributions to physics, his simple and direct way of conceiving and performing experiments, and his unique way of presenting the subjects that he loved. In this article we describe some of these events and recall what Giuseppe contributed to the various fields.

Ultra-high-energy cosmic rays


Giuseppe’s interest in the cosmos began when he was in his teens. He would design sundials for friends’ villas around his home town of Como, observe the sky and read as much about it as he could. Late one evening, he happened to observe the fall of some Perseid meteors at an unexpected time. Quickly noting their number and the time, he passed the information to a fellow astronomer – probably the first of his observations to be “published”.

He entered the cosmic-ray scene in February 1938 when he was invited to Rome for six months by Edoardo Amaldi and started working with Enrico Fermi on the construction of a cloud chamber to study cosmic radiation. When Giuseppe returned to Milan he continued to pursue his new interest in cosmic rays, in particular extensive air showers, using Geiger counters to detect them. This was to be the focus of his research for the next 22 years.


At the time, Pierre Auger had just begun his intensive investigations of air showers. Today this work is honoured in the name of the Pierre Auger Observatory, which is taking the study of the highest-energy cosmic rays to new levels through the detection of very widespread showers. In 1938, electromagnetic showers were understood; mesotrons (muons) were known, but not their interactions; and pions were yet to be discovered. The existence of multiple-particle showers, spread over many square metres, was known – nothing, however, of their origins and little about their composition.

Giuseppe’s work concentrated on the study of the composition of such showers – as a function of their lateral extent, zenith angle, and altitude – in experiments both at sea level and at 2200 m above sea level, at Passo Sella in the Dolomites. Many of these experiments were conducted with Vanna Tongiorgi, who became his wife in 1945. The couple moved to Cornell in 1947 and continued their experiments (some in collaboration with Kenneth Greisen) at Echo Lake on Mt Evans, Colorado, as well as at sea level and at 1600 m water equivalent underground. This vast range of experiments, from 1939 to 1958, contributed considerably to the understanding of cosmic-ray showers: they are produced by the interaction of high-energy nuclei – chiefly protons – with the nuclei of the upper atmosphere.

Even before the discovery in the 1960s of the feature called the “ankle” in the energy spectrum of the primaries, Giuseppe clearly realized that charged primaries with an energy in excess of 10¹⁹ eV must come from extragalactic sources, because their radius of curvature in the galactic magnetic field is of the same order as the size of our galaxy. In a talk at the 5th International Cosmic Ray Conference (ICRC) in Guanajuato, Mexico, in 1955, he said: “These particles are cosmic, indeed, because even the galaxy seems too small to contain them.”
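
A back-of-the-envelope estimate illustrates the argument (the few-microgauss field assumed here is a typical value, not a figure quoted by Cocconi): a proton of energy E in a magnetic field B has a Larmor radius

\[
r_L \;=\; \frac{E}{eBc} \;\approx\; 1.1\ \mathrm{kpc}\times\frac{E/10^{18}\ \mathrm{eV}}{B/\mu\mathrm{G}},
\]

so at 10¹⁹ eV in a field of a few microgauss the radius is several kiloparsecs – comparable to the dimensions of the galactic disc – and such particles cannot be magnetically confined within our galaxy.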

Giuseppe maintained his great interest in the physics of cosmic rays throughout his life. When he was informed that the Pierre Auger Observatory had started to operate in 2002 and had detected high-energy showers, he replied by writing “Mi ringiovanisci di cinquant’anni rinfrescando i miei primi amori (You make me 50 years younger by reminding me of my first love)”. His first love was, of course, the physics of cosmic rays. Jim Cronin, founder and first spokesperson of the observatory, recalls receiving “a wonderful congratulatory letter following our publication on 28 November 2007”, when the collaboration announced the discovery that active galactic nuclei are the most likely candidates for the source of the ultra-high-energy cosmic rays arriving on Earth. The discovery confirmed Giuseppe’s hypothesis from 50 years earlier that the highest-energy component in cosmic rays is of extragalactic origin. The Pierre Auger Observatory was inaugurated a year later, on 14 November 2008, only a few days after he passed away. One of us (GM), a long-time collaborator of Giuseppe, gave a speech as the current spokesperson of the collaboration.

Gamma rays from the cosmos


The vast majority of cosmic-ray showers originate with charged primary particles, mainly protons and nuclei not heavier than iron, but a small fraction arise from the interaction of high-energy gamma rays in the atmosphere. At the 1959 ICRC in Moscow, while on leave at CERN from Cornell, Giuseppe suggested the possibility of detecting cosmic sources of high-energy photons using coincidence techniques to separate unidirectional photons from the isotropic background. He proposed that the Crab Nebula might be a strong source of gamma rays in the tera-electron-volt range. The paper motivated Aleksandr Chudakov of the Lebedev Institute to build a pioneering gamma-ray telescope in the Crimea, designed to detect the short bursts of Cherenkov light generated in the atmosphere by extensive air showers, which had been first observed by Bill Galbraith and John Jelley at Harwell in the UK in 1953. Finally, in 1989, the Whipple air-Cherenkov telescope in the US detected the Crab Nebula as, indeed, a source of tera-electron-volt gamma rays.

Second-generation imaging air-Cherenkov telescopes (IACTs) – HESS, MAGIC and VERITAS – now cover the northern and southern hemispheres, detecting point-like and extended sources with a typical angular resolution of an arcminute. This means that galactic sources, such as supernova remnants (SNRs), can be imaged with a resolution smaller than their angular extension. A recent result from the HESS telescopes in Namibia on the emission from the nearest active galactic nucleus, Centaurus A, could explain the small cluster of a few events of ultra-high-energy cosmic rays that the Pierre Auger Observatory has observed in this direction.

Giuseppe enjoyed the discovery last year by MAGIC of very high-energy gamma rays from the active nucleus of the 3C279 galaxy. This quasar is at a distance of roughly half the radius of the universe, which is more than twice the distance of objects previously observed in gamma rays. The MAGIC Collaboration thus concludes that the universe appears more transparent at cosmological distances than previously believed, precluding significant contributions from light other than from sources observed by current optical and infrared telescopes.

The new IACTs are now complementing observations by gamma-ray telescopes in space. Giuseppe was interested in the results from two recent missions: AGILE, launched on 23 April 2007; and Fermi, launched on 11 June 2008. These missions are collecting important data on galactic and extragalactic sources in the energy range 100 MeV–100 GeV and should provide a wealth of information for understanding the sources of particle acceleration. These include gamma-ray bursts (GRBs), the most energetic phenomena to have occurred in the universe since the Big Bang. It is no surprise that Giuseppe recently developed an interest in GRBs, reinforced by frequent discussions on the subject with Alvaro de Rújula at CERN. In 2008 the Fermi mission detected the most energetic GRB so far observed, GRB 080916C, at a distance of 12.2 thousand million light-years.

It was his interest in gamma rays that sparked the work for which Giuseppe became most widely known outside particle and astrophysics, after he and Philip Morrison (visiting CERN from Cornell) published a two-page article in Nature on “Searching for interstellar communications”. Morrison recalled that: “One spring day in 1959, my ingenious friend Giuseppe Cocconi came into my office and posed an unlikely question: would not gamma rays, he asked, be the very medium of communication between stars?” Morrison agreed but suggested that they should consider the entire electromagnetic spectrum. In the resulting paper they argued for searching around the emission frequency at 1420 MHz, corresponding to the 21 cm line of neutral hydrogen. Giuseppe contacted Sir Bernard Lovell at Jodrell Bank in the UK, which had the largest radio telescope at the time, but Lovell was sceptical, and nothing came of the proposal to devote some time towards searching for an extraterrestrial signal. The first radio search for an alien signal was left to others, initially to the Ozma project, which was started independently by Frank Drake in 1959. Later, the Search for Extraterrestrial Intelligence (SETI) became a serious research topic, capturing the public’s imagination. Now, anyone with a computer can contribute to the search through SETI@home.

Rising cross-sections

The letter quoted above, written to one of us (GM) in 2002, ends as follows: “We do not yet know from where the local cosmic rays are coming. Will I live long enough to know? Move fast and keep me informed … Meanwhile cross-sections and scatterings continue their quiet life, following the new machine with the square of the logarithm.” The last sentence refers, of course, to the LHC and to the proton–proton total cross-section experiments planned by the TOTEM Collaboration. Giuseppe’s second main physics interest, after cosmic rays, was proton–proton scattering. This began at the PS at CERN in 1961, continued at Brookhaven with measurements at the highest momentum transfers then attained and, from 1971, moved to the Intersecting Storage Rings (ISR).

In 1965 Giuseppe proposed with Bert Diddens and Alan Wetherell the use of the first extracted proton beam from the PS to measure elastic and inelastic cross-sections. The first experiment was on proton–proton scattering with large momentum-transfer. A few years earlier the same group had measured the shrinking of the forward elastic peak. This discovery gave an enormous boost to the phenomenology of Regge poles, which was fashionable at the time. The ensuing interpretation of the energy dependence of the total hadron–hadron cross-sections in terms of Pomeron exchange predicted an almost constant value of the cross-section with energy – a high-energy regime called “asymptopia”, which seemed to be round the corner and would be characterized by increasing interaction radii and decreasing central opacity.


In 1970 Giuseppe’s group joined the Rome ISS group of two of us (UA and GM), who had proposed to the ISR Committee the measurement of elastic-scattering events through the detection of protons scattered only a few millimetres from a circulating-proton current of many amperes. The movable parts that contained the detectors were soon called “Roman pots”. Giuseppe very much enjoyed such a small and delicate experiment. He would spend long hours gluing together thin scintillators and measuring the position of the counters in the ISR reference frame with theodolites.

By applying the optical theorem, the CERN–Rome group found that the proton–proton cross-section rises with energy. The results were published together with a paper by the Pisa–Stony Brook collaboration, who had detected the same phenomenon by measuring the total interaction rate. In parallel, the movable pots were used to measure the interference between the Coulomb amplitude and the nuclear amplitude, which was discovered to be positive and rising with energy; a consequence – through dispersion relations – of the fact that the total cross-section continues to rise at collision energies that were not directly attainable at the ISR.
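
The relations behind the method are the standard ones, written here in natural units (the textbook form rather than a transcription of the original papers):

\[
\sigma_{\mathrm{tot}} \;=\; \frac{4\pi}{k}\,\mathrm{Im}\,f(0),
\qquad
\left.\frac{\mathrm{d}\sigma_{\mathrm{el}}}{\mathrm{d}t}\right|_{t=0} \;=\; \frac{(1+\rho^{2})\,\sigma_{\mathrm{tot}}^{2}}{16\pi},
\qquad
\rho \;=\; \frac{\mathrm{Re}\,f(0)}{\mathrm{Im}\,f(0)},
\]

so extrapolating the elastic rate to t = 0, or combining it with the total interaction rate, yields σ_tot, while the Coulomb–nuclear interference region gives access to ρ.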

The ISR best fit gave a total proton–proton cross-section that rises as the square of the logarithm of the energy, a behaviour confirmed by later experiments with Roman pots at the SPS and the Tevatron. It was to this that Giuseppe was referring when he wrote of cross-sections “continuing their quiet life” while waiting for TOTEM. He may have been disappointed that most physicists did not seem to realize the importance of this discovery. In the 1960s, asymptopia dominated; essentially nobody thought that the cross-sections could rise with energy. Even Vladimir Gribov made the hypothesis that they might be slowly decreasing, despite the observation at Serpukhov that the kaon–proton cross-section was increasing slightly. Some theoreticians – such as Marcel Froissart and André Martin, Nick Khuri and Tom Kinoshita – envisaged, from a purely mathematical point of view, that there could be a rising cross-section and tried to see the consequences. The only serious model was the one proposed by H Cheng and T T Wu in 1968.
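
This ln²s behaviour has the same functional form as the Froissart–Martin bound, which in its standard form states that

\[
\sigma_{\mathrm{tot}}(s) \;\le\; \frac{\pi}{m_{\pi}^{2}}\,\ln^{2}\!\left(\frac{s}{s_0}\right) \;\approx\; 60\ \mathrm{mb}\times\ln^{2}\!\left(\frac{s}{s_0}\right),
\]

so fits of the type σ_tot(s) = A + B ln²(s/s₀) rise as fast as unitarity and analyticity allow, although the measured coefficient B is far smaller than the coefficient appearing in the bound.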

Giuseppe was very interested in seeing what would be found in the new energy range and one of his last topics of conversation was the incident on 19 September that brought the commissioning of the LHC to a halt. He was clearly disappointed because he hoped to see proton–proton collisions at really high energies.

Unity in physics

Whenever he could, Giuseppe would use accelerator data to illuminate an open problem in cosmic-ray physics, and vice versa. A typical example is the paper published with one of us (GB) in Nature in 1987, which set a limit on the electric charge of the neutrino by calculating the dispersion in the time of flight of the neutrinos produced by SN1987A and detected by Kamiokande. He later applied a similar method to the photon pulses emitted by the millisecond pulsar PSR 1937+21 to obtain a limit on the electric charge of the photon.
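
The kinematics presumably underlying that limit can be sketched as follows (an illustration of the method, not a transcription of the published analysis): a neutrino of energy E carrying a small charge qν bends in the galactic magnetic field B with radius R = E/(qνBc), so over a path of length d its trajectory exceeds the straight line and its arrival is delayed by roughly

\[
\Delta t \;\simeq\; \frac{d^{3}}{24\,R^{2}\,c},
\]

a delay that depends on energy through R. The small observed spread in arrival times across the energy range of the SN1987A neutrinos detected by Kamiokande therefore bounds qν.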

His view of a basic unity in physical science, from galaxies to elementary particles, was clear in a series of lectures that he delivered at CERN more than 20 years ago. Following an invitation from André Martin, who at the time was chairperson of the Academic Training Committee, Giuseppe gave a course on “Correlations between high-energy physics and cosmology” in 1980. In these lectures he illustrated what he believed, at the time, were the important problems that could strengthen the relations between particle physics and cosmology – the field now known as astroparticle physics. The main themes of the past 20 years were all present: from the analysis of extragalactic emissions (with particular attention to the cosmic microwave background radiation) to the measurements of the deceleration parameter of the cosmic expansion. The series was so successful that the committee invited him to lecture again in 1984, this time on “A new branch of research: Astronomy of the most energetic gamma rays”. It, too, was a great success.

In a paper written to celebrate Edoardo Amaldi’s 60th birthday, Giuseppe expressed his continuing vision of science: “A common aim of people interested in science is that of improving the comprehension of phenomena that can be observed in the world.” Throughout his long life in science he made many contributions to improving this comprehension, through his particular approach to research. Many years after his retirement, he continued to impress younger colleagues at CERN, some of whom would hand him their recent papers for comments and advice, as Massimo Giovannini recalls. “His comments were always sharp and precise … for Giuseppe one aspect of research was the art of phrasing the complications of a phenomenon in simple numerical terms.” This was perhaps best summarized by Nobel laureate Sam Ting in his Nobel prize speech in 1976: “…I went to CERN as a Ford Foundation Fellow. There I had the good fortune to work with Giuseppe Cocconi at the Proton Synchrotron, and I learned a lot of physics from him. He always had a simple way of viewing a complicated problem, did experiments with great care and impressed me deeply.”

• The authors are grateful to Jack Steinberger and André Martin for contributions on Cocconi’s cosmic-ray experiments and on the meaning of the discovery of the rising proton–proton cross-section.

1959: the birth of the CERN Courier


The story of CERN Courier began about a year earlier, when an advertisement in the Belgian press mentioned that an international research organization based in Geneva was going to start its own periodical. It was intended as an internal public-relations gesture, meant to inform the staff of what was going on within the organization’s premises. The acronym CERN meant little, if anything, to the average reader – this writer included. Nevertheless, some months later he found himself a new member of the organization’s diminutive Public Information Office, entrusted with the task of launching a publication that reflected the high motivation of a staff dedicated to building and operating a couple of large “atom-smashing” accelerators.

The job was a typical public-relations venture aimed at fewer than 900 souls, which may nowadays seem mild and benign compared with the complexity of today’s communication assignments. Still, the task featured several aspects that had to be addressed by a newcomer in a foreign environment. This was to be carried out within an organization that, for all its culture of openness, was far from familiar with disseminating its doings in simple terms.

Questions first

Among the challenges to be resolved, the most prominent was: what support could be expected from management? Fortunately, this proved to be an academic question, because the project was the brainchild of Cornelis Bakker and, as he was the director-general, his ideas on the subject were not challenged by his administration.

Then, among the practical problems, one had to secure a budget, which meant coaxing the finance office (FO) into allocating the odd sum. In fact, the amount was so small that it could not be traced recently in the FO’s archives; fortunately, the princely figure of SFr 7200 a year has surfaced from this writer’s notes of the time. Small wonder, then, that the idea of carrying paid advertisements in an international house publication was first tried at CERN. This “invention”, although not quite as resounding as that of Tim Berners-Lee 30 years later, certainly helped the infant CERN Courier to survive. It must be said, however, that the scheme did not prove easy to manage, leading to some controversies about what content could or could not be accepted. Still, the proof of the idea’s soundness lay in its longevity and in the fact that the model was soon borrowed by other organizations.

The format was a major topic that covered several questions such as title, contents and illustration, language, size, paper weight, periodicity and distribution. Consideration of the publication’s title led to some hesitation. The name CERN Reporter was initially suggested but finally our one-man, self-appointed committee stumbled on CERN Courier, a “nom de guerre” that was accepted by the powers that were. It has stuck so far.

Deciding what the contents would include was perhaps the easier part of the production chain to tackle. Indeed, the development phase of CERN, with its two large (for the time) contraptions called accelerators – a 600 MeV synchrocyclotron and the 25 GeV (initially 24.3 GeV at 12 kG) proton synchrotron – was rife with a myriad of possible stories, both scientific and mundane. Editorial content that involved policies was routinely submitted to the director-general, who was always readily available for advising or checking. The approval of “reported” articles was, of course, always obtained from the interviewees themselves. As for illustrations, financial considerations (they were restricted to between 25% and 30% of the budget) and the state of the art in printing limited them to black and white.

Another question concerned which language (or languages) to use but the answer was obvious, because English and French were the two official languages of the organization – and still are. Initially, and for many years, two separate editions came out – Courrier CERN and CERN Courier. A decision by the CERN management in 2005 reduced the French edition of the current Courier to an embryo-sized state, thus jeopardizing the interest of a large segment of non-English-speaking staff and workers. Perhaps a bilingual formula could have been chosen to alleviate production costs.

What format, frequency and circulation to adopt for the publication proved to be trickier questions, with answers that were, of course, set by costs. However, another factor soon came to light: the time available for editorial production. Indeed, the choice of a monthly over a weekly periodical became self-evident when, following his superior’s untimely death, the budding editor found himself responsible not only for his newborn publication but also for most of CERN’s other public-relations activities, such as visits – be they general or by VIPs – and press contacts. The initial print run of 1000 copies allowed for distribution to the staff, who numbered 886 at the end of 1959. However, the interest generated in outside circles – the press, individuals, and other organizations and labs – meant that circulation quickly rose to 2000 copies by March 1960.

Meanwhile, a printer had to be chosen. Who could supply an 8-page, A4-size product printed on machine-finish paper? Three quotes were obtained from local printers and Chérix & Filanosa Cy in Nyon was selected. For distribution it was decided to have the publication sent by external post, primarily to the homes of staff members – in the hope of involving and interesting their families, whose influence on staff morale should not be underestimated.

The world premiere


With all of those matters settled, the first issue appeared in mid-August 1959. It was a modest 8-page endeavour but even so it was well received by the “Cernois/Cernites” (yes, we coined the name that early!). Even outsiders responded favourably, as witnessed among others by Albert Picot, a Geneva statesman doubling as an inveterate autodidact, and by a British member of the CERN Council, H L Verry, who found it “excellent”.

Over the years, the advent of the Weekly Bulletin in 1965 allowed the CERN Courier to switch from being the house publication to a scientific journal. The Courier thus became the ambassador of CERN and particle physics to a large community of knowledgeable specialists and inquisitive people. Indeed, the trend had been set when, soon after its inception, a special issue of the Courier was devoted entirely to the PS, coming out in time for the machine’s inauguration on 5 February 1960.

Today, looking back on the CERN Courier after 50 years, it is rewarding to see that this once-straightforward attempt at promoting subnuclear research has survived the vagaries of time. Personally, the privilege of having worked at CERN half a century ago makes one proud to have been associated – albeit in a small way – with the building and strengthening of what the then president of the Council, François de Rose, called “the greatest venture in international co-operation ever undertaken in the world of science”.

CERN Courier Archives: 1959–2009

Beam in four months


The most important event that has yet happened at CERN is the subject of the press release issued on 25 November. This news, which came just as the first proofs of this issue were coming off the press, was important enough to warrant rearranging the lay-out.

On July 27th the 100 units forming the magnet of the proton synchrotron were energized for the first time … On 13 October, after the radio frequency accelerating system had come into operation, events moved fast. On the 15th, an accelerated beam was observed during a few milliseconds. On 22 October the energy reached 400 MeV.

It was 7.40 p.m. on 24 November when the beam was accelerated to approximately 24 GeV, twenty-four thousand million electronvolt, i.e. the maximum energy under normal operating conditions. The acceleration was steady; moreover, 90% of the proton beam trapped by the synchrotron reached maximum energy. According to the physicists, this proportion is surprisingly high.

On the morning of 25 November all of the members of the Proton Synchrotron Division gathered in the main auditorium. John B Adams, under whose leadership CERN’s gigantic project has been successfully carried out, gave an account of the operations of the last few days. Expressing his gratitude to all those who, at CERN, had played a part in constructing and bringing the accelerator into operation, he announced: “Nuclear physicists will soon be able to use the machine.”

Next, Professor C J Bakker, director-general of CERN, said: “Of course, such a machine could only be the result of team work. But the team could not have worked at full pitch without the impetus of a leader: this leadership was provided by J B Adams. It is with the greatest of pleasure that I convey to him and his division the warmest congratulations of the president of the Council.”

• November 1959 pp1, 6–7 (extract)

 

Quarks and aces come to CERN

In February, a number of events combined to provide the kind of excitement for the physicists that more than makes up for the long periods of monotony and to make the rest of the staff somewhat more aware than usual that interesting things were happening.

The clues to part of the excitement had, in fact, been available in the library for a week or two, in the form of “preprints” of two theoretical papers, one by M Gell-Mann, of the California Institute of Technology, US, and the other by G Zweig, of the same Institute but at present a visiting scientist at CERN. Gell-Mann’s paper was published in Physics Letters on 1 February; Zweig’s, the more detailed of the two, is expected to appear later in Physical Review. Produced independently, both papers put forward a possible new way of looking at the theory of “unitary symmetry” known as SU3.

…The new ideas had a basic simplicity that was very appealing, and difficulties that had to be explained away in the former versions of the theory did not seem to arise this time, yet the idea of fractionally charged particles seemed quite preposterous. Even those who had suggested it seemed to share the doubts; Gell-Mann called his new particles “quarks”, bringing together literature and science with a reference to Finnegans Wake! Zweig turned to the field of card games for inspiration, and called his particles “aces”, with their combinations “deuces” and “treys”.

• March 1964 pp26–27 (extract).

 

Inauguration of the PS

Prof. J Robert Oppenheimer, director, Institute for Advanced Study, Princeton, speaking on behalf of the American Physical Society and of the National Academy of Sciences: “We wish you a future of new discovery, of increased understanding of nature, as a bright example of that co-operation which is required of us, for our survival and for the flourishing of high culture.

…We salute the vision and devotion of those who have made possible the proton synchrotron. We recognize not only that it marks a technical achievement of high significance, but also that it is a symbol of the common enterprise of people from many nations to give to all mankind new understanding of the forces that shape our physical environment.

…May those that work at CERN in the years to come find there, in steadily growing knowledge of the wondrous order of nature and of nature’s laws, ever renewed challenge for the questing mind and ever deepening satisfaction for the questing spirit.”

• March 1960 pp6–12 (extract).

 

The first g-2 experiment


The issue for April 1962 featured the g-2 experiment, with a photo of the 6 m magnet appearing on the cover. The magnet was the heart of the first g-2 experiment, the aim of which was to measure accurately the anomalous magnetic moment of the muon – the small deviation of its g-factor from 2. This experiment was one of CERN’s outstanding contributions to physics and for many years was unique to the laboratory. Indeed, three generations of the experiment were performed at CERN during its first 25 years.
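
For the record, the quantity at stake is the anomaly aμ = (gμ − 2)/2. In a uniform magnetic field, and in the absence of electric fields, the muon spin precesses relative to the momentum at the rate (a standard textbook relation)

\[
\omega_a \;=\; \omega_s - \omega_c \;=\; a_\mu\,\frac{eB}{m_\mu},
\]

independent of the muon’s energy, which is why an experiment in a known field B measures the anomaly directly rather than the full g-factor.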

Happy 50th, CERN Courier


This August, the CERN Courier is 50 years old. That’s a good excuse to take stock of what’s changed and what’s stayed the same, so I found myself a copy of issue number 1 (reprinted in the following pages). With the Courier, it’s remarkable to see the ambition contained in that first edition, and to see how much the magazine has remained faithful to its founder Cornelis Bakker’s original vision.

Visually the CERN Courier has changed beyond recognition, as has the laboratory itself. The audience has changed too. Originally conceived as an internal newsletter, the Courier today addresses a global readership of more than 25,000. One thing that has stayed the same, however, is the magazine’s openness to the world. Issue number 1 reported not only on progress towards starting up the PS, but also carried news of the City of Hamburg’s purchase of a 40 MeV linac for a new lab known as the Deutsches Elektronen Synchrotron. Back then, the Courier felt the need to spell out the DESY acronym. There was also news from the US, including bold ambitions for linear accelerator developments at Stanford University. CERN’s mission of bringing nations together for peaceful collaboration is witnessed by a report from a trip to the USSR, precursor to a long and fruitful collaboration with the Joint Institute for Nuclear Research at Dubna.

The introduction on the first page of that first issue asks the question “what will the CERN Courier be?” It goes on to explain that it is there to “maintain the ideal of European co-operation and the team spirit which are essential to the achievement of our final aim: scientific research on an international scale”. Fifty years on, the world has changed immeasurably, but those words still ring true. Let’s look forward to the next 50 years!

Rolf Heuer, director-general.

To celebrate the 50th anniversary of the CERN Courier, in this issue we have reproduced the original edition in its entirety. Since then the magazine has covered numerous dramatic discoveries and breakthroughs at CERN and elsewhere. On pages 25–28 we give just a small selection of highlights.

 

 

CHEP ’09: clouds, data, Grids and the LHC


The CHEP series of conferences is held every 18 months and covers the wide field of computing in high-energy and nuclear physics. CHEP ’09, the 17th in the series, was held in Prague on 21–27 March and attracted 615 attendees from 41 countries. It was co-organized by the Czech academic-network operator CESNET, Charles University in Prague (Faculty of Mathematics and Physics), the Czech Technical University, and the Institute of Physics and the Nuclear Physics Institute of the Czech Academy of Sciences. Throughout the week some 500 papers and posters were presented. As usual, given the CHEP tradition of devoting the morning sessions to plenary talks and limiting the number of afternoon parallel sessions to six or seven, the organizers found themselves short of capacity for oral presentations. They received 500 offers for the 200 programme slots, so the remainder were shown as posters, split into three full-day sessions of around 100 each day. The morning coffee break was extended specifically to allow time to browse the posters and discuss with the poster authors.

A large number of the presentations related to some aspect of computing for the up-coming LHC experiments but there was also a healthy number of contributions from experiments elsewhere in the world, including Brookhaven National Laboratory, Fermilab and SLAC (where BaBar is still analysing its data although the experiment has stopped data-taking) in the US, KEK in Japan and DESY in Germany.

Data and performance


The conference was preceded by a Worldwide LHC Computing Grid (WLCG) Workshop, summarized at CHEP ’09 by Harry Renshall from CERN. There was a good mixture of Tier-0, Tier-1 and Tier-2 representatives among the 228 people present at the workshop, which began with a review of each of the LHC experiments’ plans. All of these include more stress-testing in some form or other before the restart of the LHC. The transition from the Enabling Grids for E-sciencE project to the European Grid Initiative is clearly an issue, as is the lack of a winter shutdown in the LHC plans. There was discussion on whether or not there should be a new “Computing Challenge” to test the readiness of the WLCG. The eventual decision was “yes”, but to rename it STEP ’09 (Scale Testing for the Experimental Programme), schedule it for May or June 2009 and concentrate on tape recall and event processing. The workshop concluded that ongoing emphasis should be put on stability, preparing for a 44-week run and continuing the good work that has now started on data analysis.

Sergio Bertolucci, CERN’s director for research and scientific computing, gave the opening talk of the conference. He reviewed the LHC start-up and initial running, the steps being taken for the repairs following the incident of 19 September 2008 as well as to avoid any repetition, and the plans for the restart. He also described the work currently being done at Fermilab and how CERN will learn from it in the search for the Higgs boson. Les Robertson of CERN, who led the WLCG project through its first six years, discussed how we got here and what will come next. A very simple Grid was first presented at CHEP in Padova in 2000, leading Robertson to label the 2000s as the decade of the Grid. Thanks to the development and adoption of standards, Grids have now developed and matured, with an increasing number of sciences and industrial applications making use of them. However, Robertson thinks that we should consider locating Grid centres where energy is cheap, use virtualization to share processing power better, and start to look at “clouds”: what are they in comparison to Grids?

The theme of using clouds, which enable access to leased computing power and storage capacity, came up several times in the meeting. For example, the Belle experiment at KEK is experimenting with the use of clouds for Monte Carlo simulations in its planning for SuperBelle; and the STAR experiment at Brookhaven is also considering clouds for Monte Carlo production. Another of Robertson’s suggestions for future work, “virtualization”, was also one of the most common topics in terms of contributions throughout the week, with different uses cropping up time and again in the conference’s various streams.

Other notable plenary talks included those by Neil Geddes, Kors Bos and Ruth Pordes. Geddes, of the UK Science and Technology Facilities Council Rutherford Appleton Laboratory, asked “can WLCG deliver?” He deduced that it can, and in fact does, but that there are many challenges still to face. Bos, of Nikhef and the ATLAS collaboration, compared the different computing approaches across the LHC experiments, pointing out similarities and contrasts. Fermilab’s Pordes, who is executive director of the Open Science Grid, described work in the US on evolving Grids to make them easier to use and more accessible to a wider audience of researchers and scientists.

The conference had a number of commercial sponsors, in particular IBM, Intel and Sun Microsystems, and part of Wednesday morning was devoted to speakers from these corporations. IBM used its slot to describe a machine that aims to offer cooler, denser and more efficient computing power. Intel focused on its effort to get more computing for less energy, making note of work done under the openlab partnership with CERN (CERN openlab enters phase three). The company hopes to address this partially by increasing computing-energy efficiency (denser packaging, more cores, more parallelism etc) because it realizes that power is constraining growth in every part of computing. The speaker from Sun presented ideas on building state-of-the-art data centres. He claimed that raised floors are dead and instead proposed “containers” or a similar “pod architecture” with built-in cooling and a modular structure connected to overhead, hot-pluggable busways. Another issue is to build “green” centres and he cited solar farms in Abu Dhabi as well as a scheme to use free ocean-cooling for floating ship-based computing centres.

It is impossible to summarize in a short report the seven streams of material presented in the afternoon sessions but some highlights deserve to be mentioned. The CERN-developed Indico conference tool was presented with statistics showing that it has been adopted by more than 40 institutes and manages material for an impressive 60,000 workshops, conferences and meetings. The 44 Grid middleware talks and 76 poster presentations can be summarized as follows: production Grids are here; Grid middleware is usable and is being used; standards are evolving but have a long way to go; and the use of network bandwidth is keeping pace with technology. From the stream of talks on distributed processing and analysis, the clear message is that much work has been done on user-analysis tools since the last CHEP, with some commonalities between the LHC experiments. Data-management and access protocols for analysis are a major concern and the storage fabric is expected to be stressed when the LHC starts running.

Dario Barberis of Genova/INFN and ATLAS presented the conference summary. He had searched for the most common words in the 500 submitted abstracts and the winner was “data”, sometimes linked with “access”, “management” or “analysis”. He noted that users want simple access to data, so the computing community needs to provide easy-to-use tools that hide the complexity of the Grid. Of course “Grid” was another of the most common words, but the word “cloud” did not appear in the top 100 although clouds were much discussed in plenary and parallel talks. For Barberis, a major theme was “performance” – at all levels, from individual software codes to global Grid performance. He felt that networking is a neglected but important topic (for example the famous digital divide and end-to-end access times). His conclusion was that performance will be a major area of work in the future as well as the major topic at the next CHEP in Taipei, on 17–22 October 2010.

KEKB breaks luminosity record

A team working at the KEKB electron–positron collider at the KEK laboratory in Japan has broken the machine’s existing world record for luminosity by using new accelerator components called “crab cavities”. The new record is almost a factor of two higher than the original design luminosity of KEK’s B-factory.

Until 2007 the electron and positron bunches in the KEKB accelerator beams crossed at an angle of 22 mrad. The crossing angle, a unique feature of the KEKB design, provided an effective separation of the beams after collision, avoiding a high background in the detectors. Its success was evident in the world-beating luminosities that the collider achieved previously. To boost the luminosity further, however, a scheme was required that would allow an effective head-on collision between the beams while still retaining the crossing angle. To accomplish this goal, the team at KEKB designed and built special superconducting RF cavities that kick each beam sideways in the horizontal plane so that the bunches collide head-on at the interaction point. Crab cavities were first proposed for linear electron–positron colliders some 20 years ago by Robert Palmer, and in 1989 K Oide and K Yokoya proposed using them in storage rings. This was followed in around 1992 by the development of designs and prototype models by K Akai as part of collaborative work between KEK and Cornell. Detailed engineering and prototyping by K Hosoyama’s team then took place at KEKB to converge on the current design.
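
The geometric part of the gain can be seen from the usual expression for the luminosity with a crossing angle (a schematic form with generic symbols, not KEKB’s specific parameters):

\[
\mathcal{L} \;=\; \frac{f\,n_b\,N_{+}N_{-}}{4\pi\,\sigma_x^{*}\,\sigma_y^{*}}\,R,
\qquad
R \;\simeq\; \left[\,1+\left(\frac{\sigma_z}{\sigma_x^{*}}\tan\frac{\theta_c}{2}\right)^{2}\right]^{-1/2},
\]

where θc is the full crossing angle, σz the bunch length and σ*x,y the transverse beam sizes at the collision point. By tilting the bunches so that they overlap as if head-on, crab cavities push R back towards 1 while keeping the clean beam separation after the collision; effective head-on collisions are also expected to improve the beam–beam dynamics, which is where much of the additional luminosity was anticipated.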

Full-sized cavities were developed after intense discussions and elaboration. The first pair were finally installed at KEKB in January 2007 and detailed commissioning began a month later (CERN Courier September 2007 p8). Recently the team achieved a breakthrough by controlling the behaviour of off-energy beam particles using special skew-sextupole magnets. On 6 May the machine broke the world record, reaching a luminosity of 1.96×10³⁴ cm⁻²s⁻¹ with the crab cavities. At the same time, the background remained at a good level and data continued to be recorded smoothly in the Belle experiment.

For the future, a B-factory upgrade called SuperKEKB is being planned and designed in Japan. This will build on the experience and hardware developed at KEKB and will increase the luminosity by a factor of 40. The recent breakthrough and the long history of world luminosity records at KEKB suggest that this future machine will achieve its goals. A large international collaboration has been formed to upgrade the Belle detector to observe the collisions at the new high-luminosity facility.

Final magnet for sector 3-4 goes underground


The final magnet – a quadrupole short straight section – to refit sector 3-4 of the LHC was lowered into the tunnel and transported to its location on 30 April, two weeks after the 39th and final repaired dipole magnet was lowered and installed. This magnet system was the last of the spares to be prepared for use in the refurbished sector.

With all of the necessary magnets now underground, work in the tunnel will continue to connect them together. In total 53 magnets were removed from sector 3-4 following the incident on 19 September 2008. Of these, 16 magnets had sustained minimal damage and so were refurbished and put back into the tunnel; the remaining 37 were replaced by spares, depleting the number of reserve magnets to nearly zero. Work will continue on the surface to repair the remaining damaged magnets to replenish the pool of spares.

Since the start of the repair work in sector 3-4, the Vacuum Group has been cleaning the beam pipes to remove metallic debris and soot created by the electrical arc at the root of the incident. All 4800 m of the beam pipes in sector 3-4 were first surveyed centimetre by centimetre to document the damage before the cleaning work began. The cleaning process itself involves passing a brush through the pipe to clean the surface mechanically, followed by vacuuming to remove any debris both inside and outside the beam pipe. This procedure is repeated ten times, followed by a final check with an endoscopic camera. By the end of April some 70% of the affected zone had been cleaned.


Work meanwhile continues on the installation of new pressure release ports to allow a greater rate of helium escape in the event of an incident similar to that of 19 September. This is now proceeding in the areas outside the arc sections – in particular on the inner triplets (the focusing magnets either side of the collision point). The ports have been slightly modified to fit the geometry of these magnets.

The root cause of the incident on 19 September was a faulty splice in an interconnection between two magnets, and since then CERN has developed highly sensitive methods to detect splice resistances at the nano-ohm level. These have revealed a small number of splices with abnormally high resistance, which are being investigated, understood and dealt with. Now a new test has been developed to measure the electrical resistance of the connection joining the busbars of the superconducting magnets together. Each busbar consists of a superconducting cable surrounded by a larger copper block. Although the copper cannot carry the same level of current as the superconducting cable for sustained periods, it plays the essential role of providing a low-resistance path for the current when a magnet or a busbar quenches: the copper gives the protection system time to discharge the stored energy. The new test allows the electrical continuity of the copper part to be checked and so provides another important quality-control safety check for the electrical connections.

Careful tests have revealed that in some cases the process of soldering the superconductor in the interconnecting high-current splice can melt the solder joining the superconducting cable to the copper of the busbar, and thereby impede its ability to do its job if a quench occurs. As a result, the teams at work on the consolidation are improving the soldering process and checking the whole of the LHC for similar faults. A test has been done for sectors at room temperature and studies are now under way to allow the same procedure at cryogenic, but non-superconducting, temperatures. By mid-May, three sectors had been tested at room temperature, and five potentially faulty interconnections had been found. These are being repaired accordingly.

• For up-to-date news, see The Bulletin.

BEPCII/BESIII accumulates 100 million ψ(2S) in Beijing


After five years of construction, the upgraded Beijing Electron–Positron Collider (BEPCII) and the new Beijing Spectrometer (BESIII) finished accumulating their first large data set of more than 100 million ψ(2S) events on 14 April. This is the world’s largest ψ(2S) data set. Data taking started on 6 March, following a scan of the ψ(2S) peak. During the following month, as machine commissioning continued, the peak luminosity of BEPCII increased steadily from 1.4×10³² to 2.3×10³² cm⁻²s⁻¹, with beam currents of 550 mA for both electrons and positrons.

The commissioning of the upgraded accelerator and the new detector began in summer 2008, with the first event observed on 18 July. Approximately 13 million ψ(2S) events were obtained last autumn, providing data for studies of the new detector and for calibration. The results show that the detector performance is as expected: efficiency, resolution and stability all meet specifications. The new data sample of 100 million ψ(2S) events will allow more-detailed studies of detector performance, as well as many physics analyses, for example of the hc, χc and ηc charmonium states. After some accelerator studies, BEPCII and BESIII will now turn to running at the J/ψ peak, with the goal of collecting a high-statistics sample of J/ψ events.

BEPCII, the upgrade of BEPC at the Institute of High Energy Physics (IHEP) in Beijing, is a two-ring collider operating at beam energies between 1 and 2.2 GeV, in the charm energy region. It has a design luminosity of 1×10³³ cm⁻²s⁻¹ at 1.89 GeV, an improvement of two orders of magnitude on its predecessor. The BESIII detector features a beryllium beam pipe; a small-cell, helium-based drift chamber; a time-of-flight system; a CsI(Tl) electromagnetic calorimeter; a 1 T superconducting solenoid magnet; and a muon identifier using the magnet yoke interleaved with resistive plate chambers. The BESIII collaboration consists of groups from Germany, Italy, Japan, Russia and the US, as well as from many Chinese universities and IHEP.

T2K beamline starts operation


On 23 April, the Tokai-to-Kamioka (T2K) long-baseline neutrino oscillation experiment confirmed the first production of the neutrino beam by observing the muons produced by the proton beam in the neutrino facility at the Japan Proton Accelerator Research Complex (J-PARC).

The T2K experiment uses a high-intensity proton beam at J-PARC at Tokai to generate neutrinos that will travel 295 km to the 50 kt water Cherenkov detector, Super-Kamiokande, which is located about 1000 m underground in the Kamioka mine (CERN Courier July/August 2008 p19). The experiment follows in the footsteps of KEK-to-Kamioka (K2K), which generated muon neutrinos at the 12 GeV proton synchrotron at KEK.

With the beam generated at the J-PARC facility, T2K will have a muon-neutrino beam 100 times more intense than in K2K. This should allow the experimenters to measure θ13, the smallest and least well known of the angles in the neutrino mixing matrix, which underlies the explanation of neutrino oscillations. Experiments with atmospheric neutrinos have found the mixing angle θ23 to be near to its maximal value of 45°, while the long-standing solar neutrino problem has been solved by neutrino oscillations with a large value for θ12 (Borexino homes in on neutrino oscillations).
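
The sensitivity to θ13 comes from the νμ → νe appearance channel; to leading order – a standard approximation that neglects matter effects and CP violation – the probability is

\[
P(\nu_\mu \rightarrow \nu_e) \;\simeq\; \sin^{2}\theta_{23}\,\sin^{2}2\theta_{13}\,
\sin^{2}\!\left(\frac{1.27\,\Delta m^{2}_{31}[\mathrm{eV^{2}}]\;L[\mathrm{km}]}{E[\mathrm{GeV}]}\right),
\]

which for L = 295 km peaks for neutrino energies of about 0.6 GeV – the reason the T2K beam is tuned to a few hundred MeV.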
