

STAR finds heaviest antinucleus

Studies of high-energy collisions of gold ions by the STAR collaboration at the Relativistic Heavy Ion Collider (RHIC), Brookhaven, have revealed evidence of the most massive antinucleus to date. The new antinucleus is an antihypertriton – a negatively charged state containing an antiproton, an antineutron and an anti-Λ. It is also the first antinucleus containing a strange antiquark.

The new state is related to antihelium-3, with the Λ replacing one of the neutrons. The STAR team identified it via its decay into antihelium-3 and a positive pion. Altogether, in an analysis of a hundred million collisions, they found 70 ± 17 antihypertritons and 157 ± 30 hypertritons (consisting of pnΛ).
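The quoted yields permit a quick consistency check of the matter–antimatter comparison discussed below. A minimal sketch, using only the counts reported above, of the antihypertriton-to-hypertriton ratio with the uncorrelated statistical errors propagated in quadrature:

```python
import math

# Reported yields (signal counts with statistical uncertainties)
n_anti, err_anti = 70, 17    # antihypertritons
n_hyp,  err_hyp  = 157, 30   # hypertritons

# Ratio of yields; relative errors added in quadrature (assumed uncorrelated)
ratio = n_anti / n_hyp
err_ratio = ratio * math.sqrt((err_anti / n_anti) ** 2 + (err_hyp / n_hyp) ** 2)

print(f"antihypertriton / hypertriton = {ratio:.2f} +/- {err_ratio:.2f}")
# → roughly 0.45 ± 0.14, i.e. consistent within errors with similar populations
```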

In heavy-ion collisions only a tiny fraction of the emitted fragments are light nuclei, but these states are of fundamental interest. The STAR team finds that the measured yields of hypertritons (antihypertritons) and helium-3 (antihelium-3) are similar. This suggests an equilibrium in the populations of up, down, and strange quarks and antiquarks, contrary to what is observed at lower collision energies.

Super-Kamiokande sees first T2K event


The international Tokai-to-Kamioka (T2K) collaboration announced the first detection of a long-distance neutrino in the Super-Kamiokande detector on 24 February. The neutrino had travelled 295 km under the Earth’s surface from the beamline at the Japan Proton Accelerator Research Complex (J-PARC) in Tokai, north of Tokyo, to the gigantic Super-Kamiokande underground detector in an old mine near the west coast of Japan.

The T2K experiment uses a high-intensity proton beam at J-PARC in Tokai to generate neutrinos that travel to the 50 kt water Cherenkov detector, Super-Kamiokande. The experiment follows in the footsteps of KEK-to-Kamioka (K2K), which generated muon neutrinos at the 12 GeV proton synchrotron at KEK. With the beam generated at the J-PARC facility, T2K will have a muon-neutrino beam 100 times more intense than in K2K.

The experiment has been built to make high-precision measurements of known neutrino oscillations, and to look for the so-far unobserved type of oscillation that would cause a small fraction of the muon-neutrinos produced at J-PARC to become electron-neutrinos by the time they reach Super-Kamiokande.
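The oscillation physics being targeted can be illustrated with the standard two-flavour survival-probability formula. A sketch, in which the mass-splitting and mixing values are typical circa-2010 numbers assumed for illustration, not T2K measurements:

```python
import math

def survival_prob_mu(L_km, E_GeV, dm2_eV2=2.4e-3, sin2_2theta=1.0):
    """Two-flavour muon-neutrino survival probability:
    P(nu_mu -> nu_mu) = 1 - sin^2(2 theta) * sin^2(1.27 dm^2 L / E),
    with dm^2 in eV^2, L in km and E in GeV."""
    phase = 1.27 * dm2_eV2 * L_km / E_GeV
    return 1.0 - sin2_2theta * math.sin(phase) ** 2

# T2K baseline of 295 km; a beam energy around 0.6 GeV sits near the
# first oscillation maximum, where the survival probability is smallest
p = survival_prob_mu(L_km=295, E_GeV=0.6)
print(f"P(nu_mu survival) = {p:.3f}")
```

The small survival probability at this baseline and energy is precisely why the configuration maximizes sensitivity to the appearance of electron-neutrinos.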

CERN and JINR sign new agreement


CERN and JINR have a long and successful history of collaboration – the first informal meeting on international co-operation in the field of high-energy accelerators took place at CERN in 1959 – and both provided a bridge between East and West for decades. In 1992 they signed a co-operation agreement that included a significant number of protocols covering JINR’s participation in the construction of the LHC and the ALICE, ATLAS and CMS detectors, as well as in information technology. JINR has also made valuable contributions to smaller experiments at CERN.

Now that the major obligations undertaken by JINR for the construction of the LHC and its experiments have been met, CERN and JINR have decided to continue and reinforce their co-operation in the fields of particle physics, accelerator physics and technologies, educational programmes and the development of administrative and financial tools, mutually contributing to the scientific programmes of both laboratories. On 28 January, JINR’s director Alexei Sissakian and CERN’s director-general, Rolf Heuer, signed a new enlarged agreement to continue and enhance their co-operation in the field of high-energy physics.

Gravitational lensing constrains cosmology

When two galaxies at different distances happen to be aligned along the line of sight, the gravitational field of the nearer galaxy distorts the image of the more distant one into multiple arc-shaped images (CERN Courier April 2008 p11). The distortion can even form a complete ring if the alignment is perfect and the lensing galaxy has an almost spherical shape. This effect is called an “Einstein ring” because the image distortion results from bending of the light path by the curvature of space–time around the lensing galaxy, as predicted by Albert Einstein’s theory of general relativity.

Gravitational lensing has proved a powerful tool to detect Earth-like extrasolar planets, measure the distribution of dark matter and discover the most distant galaxies (CERN Courier March 2006 p12, CERN Courier January/February 2007 p11, CERN Courier April 2008 p11). Now, Sherry Suyu of the Argelander Institute for Astronomy in Bonn and an international team of collaborators have demonstrated with a detailed study of the gravitational lens B1608+656 that gravitational lensing can also measure the age and composition of the universe with an accuracy that is comparable to other methods (Suyu et al. 2010). The study is based on work published a year ago and on radio-monitoring observations taken between 1996 and 2000 (Suyu et al. 2009, Fassnacht et al. 2002). The latter measurements by the Very Large Array in New Mexico allowed the determination of differences in the lengths of the bent light paths for each of the four apparent images of the background galaxy. These differences are inferred from the time delay of variations observed in each of the four views of the same lensed radio source located in the distant galaxy.

Apart from precise time-delay determinations, there is another difficulty to master before being able to derive accurate measurements of cosmological parameters by gravitational lensing: the modelling of the mass distribution of the lensing galaxy. This was particularly difficult in the case of B1608+656 because the lens is produced by a pair of galaxies, but it proved possible thanks to a high-resolution image taken by the Hubble Space Telescope. Finally, the team used a Bayesian statistical approach to develop a complete description of the lens by combining the Hubble image, stellar-velocity dispersion measurements and the time delays between the multiple images.

This detailed analysis of the gravitational lens B1608+656 over several years yields accurate determinations of cosmological parameters, especially when combined with constraints from the five-year measurements by the Wilkinson Microwave Anisotropy Probe and assuming a flat spatial geometry. The Hubble constant is determined to be 69.7 +4.9/−5.0 km s⁻¹ Mpc⁻¹ and the equation-of-state parameter to be w = −0.94 +0.17/−0.18 (where w = −1 corresponds to a cosmological constant). These uncertainties are similar to those obtained with baryon acoustic oscillation data, showing that the gravitational-lens technique is now sufficiently mature to compete with other cosmological probes. In the near future, repeated surveys of the sky should detect several tens of suitable gravitational lenses for similar studies and hence further constrain cosmology through the careful analysis of the effects of space–time deformations.
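The scaling at the heart of time-delay cosmography is simple: for a fixed lens model, the predicted delay between images scales as 1/H₀, so an observed delay rescales a fiducial Hubble constant proportionally. A minimal sketch of that relation, with made-up numbers for illustration (these are not the B1608+656 values):

```python
# For a fixed lens mass model, predicted time delay ∝ 1/H0, hence
# H0_inferred = H0_fiducial * (dt_predicted_at_fiducial / dt_observed).
# All numbers below are invented for illustration only.

def h0_from_delay(dt_observed_days, dt_predicted_days, h0_fiducial=70.0):
    """Rescale a fiducial Hubble constant by the ratio of the delay the
    fiducial model predicts to the delay actually observed."""
    return h0_fiducial * dt_predicted_days / dt_observed_days

# If the fiducial model predicts a 32-day delay but 31.5 days is observed,
# the inferred H0 shifts up in proportion
h0 = h0_from_delay(dt_observed_days=31.5, dt_predicted_days=32.0)
print(f"H0 ≈ {h0:.1f} km/s/Mpc")
```

This is also why the precision of the time-delay measurements and of the lens mass model, discussed in the paragraphs above, directly limits the precision on H₀.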

The first capacitative touch screens at CERN


“The proton synchrotron currently being built by CERN (the SPS) will be controlled centrally from three control desks, each with its own minicomputer. Only a few knobs and switches must control all of the many thousands of digital and analogue parameters of the accelerator, and an operator will watch the machine on at most half-a-dozen displays … An advantage of the new form of control is that since there are so few controls and displays, they may be made more elaborate and powerful.”

Thus begins a CERN report written in May 1973 by Frank Beck and Bent Stumpe of the controls group (Beck and Stumpe 1973). It describes two devices: the touch screen and the computer-controlled knob. CERN’s member states had approved the construction of the Super Proton Synchrotron (SPS) in February 1971. With its circumference of nearly 7 km, it was a giant machine for its day – some 10 times the size of the Proton Synchrotron (PS) that had started up in 1959. The scale of the new machine meant that control via individual cables linking directly to a central control room – as was done for the PS – would be economically unfeasible. One of the first tasks of the nascent SPS controls group, therefore, was to find a practical and economical solution.

The timing was just right for developing central control supported by computers. Industry was beginning to commercialize minicomputers, so the idea began to take shape of equipping sectors locally with minicomputers controlled by message transfer from the central control room. This would overcome the enormous requirement for cables. The next question was how to create an “intelligent” system based on minicomputers to replace the thousands of buttons, switches and oscilloscopes that a conventional control system would need for a machine as large as the SPS.


A human has only two hands, but if control devices could be redefined fast enough by computer, then only one button (or knob or pointing device) would be needed to do the job of controlling many different devices or parameters. The main uses of the “master button” would be to select accelerator subsystems for control and monitoring, as well as to select from hundreds of analogue signals the ones to show on displays at any one time. The minicomputers made by Norsk Data at the time seemed to be powerful enough for such a system.

Frank Beck, who was to become head of the SPS Central Controls, was aware of the possibilities offered by existing touch screen technology in which a panel of buttons with labels written by computer can be changed simply by touch to control different aspects of a system. By presenting successive choices that depend on previous decisions, the touch screen would make it possible for a single operator to access a large look-up table of controls using only a few buttons.
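The “successive choices” scheme is essentially a tree of menus: each touch narrows the selection until a single parameter is reached, so a handful of buttons can address a very large look-up table. A toy sketch of the idea, with all subsystem and parameter names invented for illustration:

```python
# With b buttons and d menu levels, b**d distinct controls are reachable.
# Every name in this tree is invented; it only illustrates the principle.
menu_tree = {
    "RF": {"cavity 1": {"phase": "RF.C1.PHASE", "voltage": "RF.C1.VOLT"},
           "cavity 2": {"phase": "RF.C2.PHASE", "voltage": "RF.C2.VOLT"}},
    "Magnets": {"dipoles": {"current": "MAG.DIP.CURR"},
                "quadrupoles": {"current": "MAG.QUAD.CURR"}},
}

def navigate(tree, choices):
    """Follow a sequence of touched buttons down to a parameter name."""
    node = tree
    for choice in choices:
        node = node[choice]   # each touch descends one menu level
    return node

param = navigate(menu_tree, ["RF", "cavity 1", "phase"])
print(param)  # RF.C1.PHASE
```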


It was clear that the only practical way to create buttons with variable labels by computer at that time was on a cathode-ray tube (CRT) screen. The question then was how the computer could detect which button was being selected. The rather complicated mechanical designs that existed did not seem suitable for the SPS control system. For example, David Fryberger and Ralph Johnson at SLAC had invented a device based on acoustic waves – Rayleigh waves – travelling in the surface of a sheet of glass, which had already been used for accelerator control (Fryberger and Johnson 1971). This worked but required a bulky frame around the screen. Beck discussed this with his colleague Stumpe, from the Data Handling Division, and asked if he could suggest a better technical solution.

In a handwritten note dated 11 March 1972, Stumpe presented his proposed solution – a capacitative touch screen with a fixed number of programmable buttons presented on a display. It was extremely simple mechanically. The screen was to consist of a set of capacitors etched into a film of copper on a sheet of glass, each capacitor being constructed so that a nearby flat conductor, such as the surface of a finger, would increase the capacity by a significant amount. The capacitors were to consist of fine lines etched in copper on a sheet of glass – fine enough (80 μm) and sufficiently far apart (80 μm) to be invisible (CERN Courier April 1974 p117). In the final device, a simple lacquer coating prevented the fingers from actually touching the capacitors.

Stumpe was immediately recruited into the controls group to develop the necessary hardware, and the first capacitor to prove that the idea worked was produced at CERN in 1973. Chick Nichols was able to use ion-sputtering equipment available in one of the workshops to evaporate a fine layer of copper or gold on a flexible, transparent Mylar sheet to make the first working device. A prototype glass screen with nine touch buttons followed soon after.

The fineness of the lines and their pitch meant that a great deal of care was needed to produce the screen, but it turned out to be possible with the techniques normally used to make printed circuit boards. At first, placing the copper layer on the glass appeared difficult and it proved impossible to get reliable adherence with vacuum deposition. However, ion sputtering gave better results. By ensuring that the glass was scrupulously clean and by depositing the copper slowly – an hour for a layer of about 10 μm – it was possible to get adherence strong enough to allow soldered connections to the glass.

The capacitance of each button was about 200 pF, increasing by about 10% when a finger came close. The method chosen to detect the change in capacitance was to use a phase-locked oscillator circuit, which had recently become available as a single integrated-circuit chip. One circuit acted as a reference oscillator, while each button had a similar circuit. The oscillator attached to a button locked in to the frequency of the reference oscillator (120 kHz), so that a change in capacitance altered the phase but not the frequency. The phase shift was converted to a voltage shift, which indicated that the button had been touched. The circuit was highly immune to noise and transients. Moreover, any drifts would be common to both oscillators, so good thermal stability could be obtained with commercial components.
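The detection principle can be sketched numerically: a touch raises the button capacitance by about 10%, which in the locked oscillator appears as a phase shift relative to the 120 kHz reference, recoverable by quadrature (I/Q) mixing. This is a simulation of the principle only, not the actual SPS circuit; the capacitance-to-phase constant is invented for illustration.

```python
import math

F_REF = 120e3          # reference oscillator frequency, Hz
C_REST = 200e-12       # button capacitance at rest, F (about 200 pF)

def phase_of(signal_phase_rad, n=1000):
    """Recover a sinusoid's phase relative to the reference by
    quadrature (I/Q) demodulation over one reference period."""
    i_acc = q_acc = 0.0
    for k in range(n):
        t = k / (n * F_REF)            # sample one full reference period
        s = math.sin(2 * math.pi * F_REF * t + signal_phase_rad)
        i_acc += s * math.sin(2 * math.pi * F_REF * t)   # in-phase mixer
        q_acc += s * math.cos(2 * math.pi * F_REF * t)   # quadrature mixer
    return math.atan2(q_acc, i_acc)

# Model the touch as a phase shift proportional to the capacitance change
# (the proportionality constant below is invented for this sketch)
K_RAD_PER_FARAD = 1e10
touched = phase_of(K_RAD_PER_FARAD * 0.10 * C_REST)   # +10% on 200 pF
untouched = phase_of(0.0)
print(f"detected phase shift: {touched - untouched:.3f} rad")
```

Because the same reference feeds every mixer, slow drifts cancel in the comparison, which mirrors the thermal-stability argument made above.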

Into production


As soon as it was clear that the system could successfully recognize which of the nine buttons was touched, Beck showed the prototype to those in charge of the SPS project. Even before reliability tests had been performed, the decision was taken to use the touch-screen system and begin development of the control software on the first minicomputers (Nord 1, and later Nord 10) that CERN had received from Norsk Data. This was definitely a risk, but had the decision not been made then the control group would have had no option but to use conventional technology for central control of the SPS. Tests later proved the reliability of the technique.

The next step was the development of a more practical touch screen with 16 buttons. The new central SPS control room needed several devices and industry was soon involved. Manufacturing of the touch screen itself proceeded in collaboration with a Danish company, Ferroperm. This led to the development of a robust glass screen with reduced surface reflections. At the same time another Danish company, NESELCO, became involved in producing the electronic modules needed to drive the touch screen.


When the SPS started up in 1976 its control room was fully equipped with touch screens – apparently the first application of the capacitative touch screen in the world. Touch screens later took their place in modernized control systems for the PS, which had preceded the SPS by nearly 20 years, as well as for the subsequent and much bigger Large Electron Positron collider. Some of these screens continued to operate until the new CERN Control Centre took over operations in 2006 – a lifetime of 30 years.

In 1977 CERN demonstrated the potential of the new touch screen for industrial control in no less a place than the huge and famous Hanover Fair. In the hall for new industrial inventions, CERN presented the “Drinkomat”, with a complete operational console similar to the one used to control the SPS, including a Nord 10 computer. The system was built by Alain Guiard, who at the time was using a touch screen to control a large film-development installation at CERN, which allowed exact control of the liquids used in the process. Through multiple choices on a touch screen, the Drinkomat allowed people to mix drinks and follow the process visually, foreshadowing the machines that came into CERN’s cafeterias nearly 30 years later.


By 1977 the capacitative touch screen was already available commercially and being sold to other users within CERN and to other research institutes and companies wishing to use the screens in their own control systems (Crowley-Milling 1977). Its use spread around the world: JET and the Rutherford Laboratory in the UK; KEK, Mitsubishi and the TOYO corporation in Japan; the Rigshospitalet in Denmark and the Hahn-Meitner Institute in Germany.

One reason behind the success of the system was a decision at CERN to build electronic modules in the CAMAC system, used not only all over CERN but throughout the world. This made it easy for users to buy individual modules for integration into their own systems. By 1980, more than 10 different CAMAC modules developed at CERN had been brought to the market by NESELCO. Furthermore, a CAMAC module with an integrated computer for driving the touch screen was developed in 1977, shortly followed by a CAMAC crate computer using the Motorola 68000 microprocessor. These modules were integrated into an intelligent “Touch Terminal”, which was commercialized by NESELCO in 1980; it was the world’s first commercial touch-screen computer.

At CERN the Touch Terminal was used for the control of the Antiproton Accumulator, which allowed the SPS to become a proton–antiproton collider and gather fame for CERN through the discovery of the W and Z bosons and the subsequent awarding of the Nobel prize to Carlo Rubbia and Simon van der Meer.


The original touch screen had only 16 fixed “buttons” associated with distinct areas of the screen, but already in 1978 it was obvious that a more flexible arrangement for dividing up the screen would have many advantages. Stumpe developed his original concept to create an X–Y touch screen, in which the idea was to sense the position touched via two layers of capacitors corresponding to X and Y co-ordinates. Following prototype work at CERN, development began with NESELCO and the University of Aarhus, supported by the Danish state development funds. The X–Y screen involved new techniques for metallization on various substrates, which became the subject of patent rights. Stumpe was asked to sign a nondisclosure agreement, which he refused to do because CERN required that all inventions should be published. At this point, CERN’s involvement with the further development of touch screens came to an end.
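The X–Y idea can be pictured as two independent scans: one capacitance measurement per row and per column, with a touch located at the row and column showing the largest change. The following is a toy model of that readout logic, not the patented design developed with NESELCO and Aarhus:

```python
def locate_touch(row_deltas, col_deltas, threshold=0.05):
    """Return the (x, y) grid index of a touch from per-column and
    per-row relative capacitance changes, or None if nothing exceeds
    the detection threshold."""
    dx = max(range(len(col_deltas)), key=lambda i: col_deltas[i])
    dy = max(range(len(row_deltas)), key=lambda i: row_deltas[i])
    if col_deltas[dx] < threshold or row_deltas[dy] < threshold:
        return None          # no finger present
    return dx, dy

# Simulated relative capacitance changes: a finger over column 2, row 1
cols = [0.01, 0.02, 0.12, 0.01]
rows = [0.00, 0.10, 0.02]
print(locate_touch(rows, cols))  # (2, 1)
```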

The new CERN Control Centre (CCC), which oversees the control of CERN’s entire accelerator complex, including the PS, SPS and now the LHC, has no touch screens for accelerator control. Today the use of the ubiquitous mouse as a pointing device provides the same type of computer control. Moreover, PC-based systems with standard displays are inexpensive and easy to install. In 1972, when the touch screen was developed at CERN for controlling the new SPS, the situation was different: nothing was commercially available and every control device had to be invented, including the colour displays.

However, touch screens are undoubtedly not absent from the CCC, as the operators often communicate with colleagues by mobile phones with capacitative touch screens. The idea invented at CERN in 1972 has been reinvented in many applications, from “Drinkomats” to rail and airline ticket machines to the multifunction phones that sit in many pockets – not only in the CCC but all around the world.

Creativity and intellect: when great minds meet


At the City College of New York, Arthur I Miller took large doses of philosophy in addition to physics. This was the start of a path that would lead him to become a well known historian of science and acclaimed author. He earned a PhD in physics at the Massachusetts Institute of Technology and went on to do research in theoretical particle physics. He soon became fascinated with the history of ideas and the role of visual thinking in highly creative research.

In 1991 Miller moved to England where he became professor of history and philosophy of science at University College London. Three years later he founded the Department of Science and Technology Studies, which grew out of the original Department of History and Philosophy of Science. He has lectured and written extensively about his research into the history and philosophy of 19th- and 20th-century science and technology, as well as about cognitive science, scientific creativity and the relationship between art and science.

He is the author not only of academic books but also of several widely acclaimed books meant for a wider audience, including Einstein, Picasso: space, time and the beauty that causes havoc (2001), nominated for the Pulitzer Prize, and Empire of the Stars: friendship, obsession and betrayal in the quest for black holes (2005). In December he visited CERN to give a colloquium on his latest book, Deciphering the Cosmic Number: the strange friendship of Wolfgang Pauli and Carl Jung (2009).

When did your interest in interdisciplinary studies start?

Even though physics was what I focused on at university, my passion has always been those pesky “what is the nature of” questions, such as “what is the nature of charge, of mass, of space, of time, of the mind, and so on”. I wanted to understand how scientists made discoveries and how the mind works. Looking into the original German-language papers written by giants of 20th-century physics such as Albert Einstein, Niels Bohr, Werner Heisenberg and Wolfgang Pauli, I came to understand the important role of visual imagery in scientific discovery. I decided to look into this further. I became curious as to how images were generated and stored in the mind, to be called out and used in thinking. I turned to cognitive science, which gave me the means to structure my ideas. This led to my investigation into concepts such as aesthetics, beauty, intuition and symmetry, and how they are used in science and art.

What intrigued you about the lives of Albert Einstein and Pablo Picasso?

The most important scientist of the 20th century, Albert Einstein, and its most important artist, Pablo Picasso, went through their period of greatest creativity and achievements around the same time, and in similar circumstances. In 1905 Einstein discovered his theory of relativity and in 1907 Picasso discovered Les Demoiselles d’Avignon, the painting that brought art into the 20th century and that contains the seeds of cubism. Even though they did not know about each other, they were both – each in his own way – identifying connections across the so-called “two cultures” of science and art, and striving to find a solution to the question of how to represent the nature of space and time in a more satisfying manner.

At the beginning of the 20th century, it was in the air that revolutionary changes were about to occur in many fields. Yet some of the greatest thinkers of the period bucked this tide. The great French philosopher-scientist Henri Poincaré was one of them. To my surprise, he turned out to be a common denominator between Einstein and Picasso. Both men were inspired by his book, Science and Hypothesis. Poincaré failed because he was unable to rid himself of the notion that time was an absolute and not a relative quantity. Just the opposite of what Einstein found when he combined space and time into a single continuum – space–time – and what Picasso did in his cubism, when he represented multiple perspectives all at once on a single canvas. Einstein studied temporal simultaneity, Picasso spatial simultaneity.

Is there a relationship between historical periods and people’s achievements?

Definitely. At that time, people were responding, with different degrees of success, to the mysterious synchronous effects of the Zeitgeist – the avant-garde, the intellectual tidal wave that swept across Europe. In fact, it was not an accident that Einstein and Picasso worked on the same problem – the nature of space and time. It was the principal problem of the avant-garde. In 1902, two years after his graduation from the ETH, Einstein was employed at the Swiss Federal Patent Office, in Bern, and was out of the academic mainstream. Picasso, on the other hand, was in Paris, in the centre of things. Most scientists thought that Poincaré would make major breakthroughs in physics, although of a sort that supported the claims of Newtonian science regarding space and time. Most artists in Paris considered that André Derain, Henri Matisse’s star student, was the one who would make the breakthrough to a radically new conceptual art.

Just as Poincaré could not break away from classical thought, Derain did not take seriously the dazzling developments in science, technology and mathematics. Only Picasso and Einstein were in resonance with the drum beat of the avant-garde. To accomplish their breakthroughs both men realized that they had to discover a new aesthetic: for Picasso it was the reduction of forms to geometry; for Einstein it was a minimalist aesthetic, which allowed him to remove “asymmetries that do not appear to be inherent in the phenomena”, as he wrote in the first sentence of his 1905 relativity paper. At their creative moment boundaries between disciplines dissolved and aesthetics became paramount for both of them.

What criteria do you use to compare people in your books?

I look for parallelisms in the working and private lives of highly creative thinkers (Einstein and Picasso). Pairs in opposition are of interest to me in what they say about the human element in science (Chandrasekhar and Eddington) or in a situation in which each learns from the other (Pauli and Jung). For example, Pauli was able to understand the forces that drove his personal life as well as his creative verve. In fact, an important discovery of his – CPT symmetry – stemmed from a dream that he and Jung analysed using Jungian psychoanalysis. Jung learnt enough quantum physics from Pauli to bring to fruition one of his greatest ideas – synchronicity.

What can you say about high creativity?

Highly creative researchers are not deterred by mistakes and failures. Rather, they learn from them and turn the situation to their advantage. J Robert Oppenheimer once gave a particularly interesting definition of an expert as “a person who has made all possible mistakes”. Some other hallmarks of high creativity are that early in life the highly creative person realizes the field in which he or she is most competent and then mines it. They also exhibit an almost frighteningly focused mind when they work on a problem, to the exclusion of all else. Such was the case with Einstein and Picasso.

Is intuition part of creativity and the intellectual process?

I think that it is in both. There is nothing mysterious about intuition. It comes about mainly through an accumulation of knowledge. People can make an evaluation within a fraction of a second just because they have a lot of experience behind them. Having an intuition for what to do, solving a problem, judging a work of art, means having made a lot of errors and judgements along the way. Intuition is an achievement, albeit with a bit of the irrational mixed in – just like in scientific discovery. I think that there is not much difference between artistic thinking and scientific thinking, even if sometimes scientists want to appear less emotional and artists less rational.

Of course, an objective truth exists – on this every scientist would agree, even in this era of multiverses. There is a real external world “out there” beyond appearances and science is a way of getting a glimpse of it. Today, scientists have only begun to explore concepts like consciousness. One of the reasons I wrote my book about Jung and Pauli was to bring to everyone’s attention the high level of their discussions about issues that spanned physics, psychology, biology, religion, ESP, UFOs and Armageddon. They realized that neither physics nor psychology alone could answer deep questions such as: “What is the nature of consciousness?” Only an interdisciplinary approach could succeed.

What can you say about interdisciplinary research today?

Beginning in about the 1980s it became evident that, for example, biology needed various forms of technology – and also mathematics and physics. The need for interdisciplinarity soon became evident for physics as well, especially with the advent of health physics, computational physics, nanotechnology and then developments in biology. Nevertheless, most universities maintain a departmental structure and consequently a lack of complete interdisciplinarity. Moreover, there are too many instances where students with a PhD in an interdisciplinary topic have problems in obtaining a job.

One of the stumbling blocks here is the lack of a common language across different domains. This lack of communication makes people afraid of an outsider interfering in their field. When I was writing my book on Einstein and Picasso I found that, whereas in most cases artists were easy to deal with, not so for historians of art. Their post-modernistic jargon necessarily closes them off from an interdisciplinary approach. Most of them still consider Picasso’s discovery of cubism to have been rooted in African art and the art of Cézanne, ignoring the essential role of science, technology and mathematics in his thinking. Picasso’s stunning discovery of cubism formalized the formerly informal language of art and brought it back into contact with science, where it has been ever since.

• For the video of the colloquium by Arthur I Miller, “The strange friendship of Pauli and Jung – when physics met philosophy”, see http://cdsweb.cern.ch/record/1228081.

Particle physics INSPIREs information retrieval


Particle physicists thrive on information. They first create information by performing experiments or elaborating theoretical conjectures. Then they convey it to their peers by writing papers that are disseminated in a preprint form long before publication. Keeping track of this information has long been the task of libraries at the larger laboratories, such as at CERN, DESY, Fermilab and SLAC, as well as being the focus of indispensable services including arXiv and those of the Particle Data Group.

It is common knowledge that the web was born at CERN, and every particle physicist knows about SPIRES, the place where they can find papers, citations and information about colleagues. However, not everyone knows that the first US web server and the first database on the web came about at SLAC with just one aim: to bring scientific information to the fingertips of particle physicists through the SPIRES platform. SPIRES was hailed as the first “killer” application of the then nascent web.

No matter how venerable, the information tools currently serving particle physicists no longer live up to expectations and information management tools used elsewhere in the world have been catching up with those of the high-energy physics community. The soon-to-be-released INSPIRE service will bring state-of-the-art information retrieval to the fingertips of researchers in high-energy physics once more, not only enabling more efficient searching but paving the way for modern technologies and techniques to augment the tried-and-tested tools of the trade.

Meeting demand

The INSPIRE project involves information specialists from CERN, DESY, Fermilab and SLAC working in close collaboration with arXiv, the Particle Data Group and publishers within the field of particle physics. “We separate the work such that we don’t duplicate things. Having one common corpus that everyone is working on allows us to improve remarkably the quality of the end product,” explains Tim Smith, head of the User and Document Services Group in the IT Department at CERN, which is providing the Invenio technology that lies at the core of INSPIRE.

In 2007, many providers of information in the field came together for a summit at SLAC to see how physics-information resources could be enhanced. The INSPIRE project emerged from that meeting and the vision behind it was built from a survey launched by the four labs to evaluate the real needs of the community (Gentil-Beccot et al. 2008). A large number of physicists replied enthusiastically, even writing reams of details in the boxes that were made available to input free text. The bulk of the respondents noted that the SPIRES and arXiv services were together the dominant resources in the field. However, they pointed out that SPIRES in particular was “too slow” or “too arcane” to meet their current needs.

INSPIRE responds to this directive from the community by combining the most successful aspects of SPIRES (a joint project of DESY, Fermilab and SLAC) with the modern technology of Invenio (the CERN open-source digital-library software). “SPIRES’ underlying software was overdue for replacement, and adopting Invenio has given INSPIRE the opportunity to reproduce SPIRES’ functionality using current technology,” says Travis Brooks, manager of the SPIRES databases at SLAC. The name of the service, with the “IN” from Invenio augmenting SPIRES’ familiar name, underscores this beneficial partnership. “It reflects the fact that this is an evolution from SPIRES because the SPIRES service is very much appreciated by a large community of physicists. It is a sort of brand in the field,” says Jens Vigen, head of the Scientific Information Group at CERN.

However, INSPIRE takes its own inspiration from more than just SPIRES and Invenio. In searching for a paper, INSPIRE will not only fully understand the search syntax of SPIRES, but will also support free-text searches like those in Google. “From the replies we received to the survey, we could observe that young people prefer to just throw a text string in a field and push the search button, as happens in Google,” notes Brooks.

This service will facilitate the work of the large community of particle physicists. “Even more exciting is that after releasing the initial INSPIRE service, we will be releasing many new features built on top of the modern platform,” says Zaven Akopov of the DESY library. INSPIRE will enable authors and readers to help catalogue and sort material so that everyone will find the most relevant material quickly and easily. INSPIRE will also be able to store files associated with documents, including the full text of older or “orphaned” preprints. Stephen Parke, senior scientist in the Fermilab Theory Department, looks forward to these enhancements: “INSPIRE will be a fabulous service to the high-energy-physics community. Not only will you be able to do faster, more flexible searching but there is a real need to archive all conference slides and the full text of PhD theses; INSPIRE is just what the community needs at this time.”


Pilot users see INSPIRE already rising to meet these expectations, as remarked on by Tony Thomas, director of the Australian Research Council Special Research Centre for the Structure of Matter: “I tried the alpha version of INSPIRE and was amazed by how rapidly it responded to even quite long and complex requests.”

The Invenio software that underlies INSPIRE is a collaborative tool developed at CERN for managing large digital libraries. It is already inspiring many other institutes around the world. In particular, the Astrophysics Data System (ADS) – the digital library run by the Harvard-Smithsonian Center for Astrophysics for NASA – recently chose Invenio as the new technology to manage its collection. “We can imagine all sorts of possible synergies here,” Brooks anticipates. “ADS is a resource very much like SPIRES, but focusing on the astronomy/astrophysics and increasingly astroparticle community, and since our two fields have begun to do a lot of interdisciplinary work the tighter collaboration between these resources will benefit both user communities.”

Invenio is also being used by many other institutes around the world and many more are considering it. “In the true spirit of CERN, Invenio is an open-source product and thus it is made available under the GNU General Public Licence,” explains Smith. “At CERN, Invenio currently manages about a million records. There aren’t that many products that can actually handle so many records,” he adds.

Invenio has at the same time broadened its scope to include all sorts of digital records, including photos, videos and recordings of presentations. It makes use of a versatile interface that makes it possible, for example, to have the site available in 20 languages. Invenio’s expandability is being exploited to the full for the INSPIRE project, where a rich set of back-office tools is being developed for cataloguers. “These tools will greatly ease the manual tasks, thereby allowing us to get papers faster and more accurately into INSPIRE,” explains Heath O’Connell from the Fermilab library. “This will increase the search accuracy for users. Furthermore, with the advanced Web 2.0 features of INSPIRE, users will have a simpler, more powerful way to submit additions, corrections and updates, which will be processed almost in real time.”

Researchers in high-energy physics were once the beneficiaries of world-leading information management. Now INSPIRE, anchored by the Invenio software, aims once again to give the community a world-class solution to its information needs. The future is rich with possibilities, from interactive PDF documents to exciting new opportunities for mining this wealth of bibliographic data, enabling sophisticated analyses of citations and other information. The conclusion is easy: if you are a physicist, just let yourself be INSPIREd!

• The INSPIRE service is available at http://inspirebeta.net/.

CAST’s first decade of solar-axion research

In 1983, when I was thinking about how axions may be produced and detected by their conversion to photons in a magnetic field, it struck me suddenly that there is no need to produce axions because the Sun does that for us. The solar axion flux is much larger than any that we could produce on Earth, and it is here free of charge. Our job is simply to detect these solar axions.

– Pierre Sikivie of the University of Florida.


Axions are one of the favoured candidates for the mysterious dark matter created in the early universe. A variety of observatories located on Earth and in outer space form a quasi-network that can target specific places in the search for these particles, such as the galactic centre, the inner Earth and the Sun’s hot core. The CERN Axion Solar Telescope (CAST) points at the Sun – its aim being the direct detection of axions or other exotic particles with similar properties.

While relic axions from the early universe should propagate with a velocity of about one thousandth of the speed of light, solar axions – with a broad spectral shape of around 4–5 keV kinetic energy – are relativistic. The open window for the axion rest mass is currently in the micro-electron-volt to electron-volt range. The several orders of magnitude difference in kinetic energy associated with the two origins make for different experimental search techniques: microwave cavities for relic axions versus X-ray detectors for solar axions. However, both techniques use a magnetic field as the catalyst that allows axions to become photons.

Accelerator laboratories, with their powerful magnets, are natural locations for axion helioscopes – the instruments used to search for axions from the Sun. The first experiment to look at the Sun, which incorporated a 2.2-m iron-core magnet, was set up by a Rochester-Brookhaven-Fermilab (RBF) collaboration in the early 1990s. It was followed by the Sumico experiment based on a 2.3-m long superconducting magnet at the University of Tokyo, which is still in operation. The CAST helioscope at CERN uses a decommissioned LHC-dipole test magnet, with a field of 9 T and two tubes – originally designed to house the beam pipes – that are 9.2 m long and have an aperture of 43 mm. The dipole is one of four original prototypes and was rescued at the last minute, just before it was to be scrapped along with the others. A comparison of CAST’s performance with its two predecessors in Brookhaven and Tokyo shows that the LHC magnet was a good choice.

The possibility that a bending magnet could be used to make visible the “dark” Sun was – and still is – inspiring and motivating. To transform the multi-tonne superconducting, superfluid-helium-cooled magnet from a static LHC prototype dipole into a helioscope that can track the Sun with millimetre precision involved delicate engineering work and cryo-expertise. Thankfully, Louis Walckiers in the Accelerator Technology Division supported the idea, even though we had both just failed to prove with the same magnet that the biomechanics of cell-structure formation becomes confused in a 9 T environment.

Recycling space technology


Position-sensitive X-ray detectors of the MicroMegas type, invented by Georges Charpak and Ioannis Giomataris at CERN, now cover three of the four ends of the tubes through the magnet, making CAST the only axion helioscope to have implemented such technology. For the fourth exit, together with Dieter Hoffmann and Joachim Jacoby of TU Darmstadt, we were able to recover an excellent X-ray imaging telescope from the German space programme, which was delivered by Heinrich Bräuninger from the Max Planck Institute for Extraterrestrial Physics in Garching. With state-of-the-art X-ray optics and low-noise X-ray pixel detectors at the focal plane, this not only improves the signal-to-noise ratio substantially but also allows for the unambiguous identification of the axion signal. Its CCD imaging camera simultaneously measures the expected solar-axion signal spot and the surrounding background. This is an important feature that makes CAST unique as an axion helioscope. With most of the components located, CAST received formal approval at CERN in April 2000.

In the same way that much of the CAST equipment was recycled from particle physics so, too, was its working principle: the Primakoff effect, known since 1951, which regards the production of neutral pions by the interaction of high-energy photons with the high electric field of the nucleus as the reverse of the decay into two photons. The expectation is that the quasi-stable axion should “decay” in the presence of a magnetic field into a photon emitted exactly along the axion’s trajectory. In principle this allows for a perfect axion telescope thanks to the spatial resolution of the X-ray telescope.

The Primakoff effect deserves to be a textbook example of macroscopic quantum-mechanical coherence, which, in astrophysical magnetic fields, can extend over kiloparsecs – although only for very small axion rest masses. For CAST, coherence holds over the whole length of the magnet, around 9 m, provided that the particle rest mass is below 0.02 eV/c2 when the two pipes are vacuum-pumped. To extend the detection sensitivity to higher masses, adding a certain amount of helium as a refractive gas to the 1.8 K cold magnetic pipes restores coherence for rest masses up to around 1 eV/c2, although each gas-pressure setting covers only a narrow range in solar-axion rest mass. With this adaptation, suggested in 1988 by two collaboration members, Karl van Bibber and Georg Raffelt, and implemented during 2005 and 2006, CAST has become a scanning experiment. The rest-mass range for solar axions that will be scanned by the end of 2010 fits between the cosmologically derived upper limit of about 1 eV/c2, from the Wilkinson Microwave Anisotropy Probe (WMAP) data, and the lower limit around 1 μeV/c2, which arises because axions with lower rest mass would be produced in the early universe with a total mass exceeding the critical density (“overclosure”).
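The vacuum coherence limit quoted above can be checked with a back-of-the-envelope estimate. A minimal sketch, assuming the standard coherence condition qL < π with momentum transfer q = m2/2E in natural units (this is illustrative arithmetic, not CAST collaboration code; the numbers for L and E are taken from the text):

```python
import math

# hbar*c, used to convert the magnet length into natural units (eV^-1)
HBARC_EV_M = 1.97327e-7   # eV * m
L_M = 9.2                 # magnet tube length in metres (from the text)
E_EV = 4.2e3              # typical solar-axion energy, around 4.2 keV

def coherence_mass_limit(length_m: float, energy_ev: float) -> float:
    """Largest axion rest mass (in eV/c^2) for which the axion-photon
    oscillation stays coherent over the magnet, i.e. q*L < pi
    with q = m^2 / (2E)."""
    length_natural = length_m / HBARC_EV_M  # length in eV^-1
    return math.sqrt(2 * math.pi * energy_ev / length_natural)

m_max = coherence_mass_limit(L_M, E_EV)
print(f"vacuum coherence holds up to m_a ~ {m_max:.3f} eV/c^2")
```

This yields roughly 0.02 eV/c2, consistent with the vacuum limit quoted in the text; filling the pipes with helium gives the photon an effective mass and shifts this window upwards.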


The precise pressure settings for the helium gas and controlled changes in the very cold magnet pipes are highly demanding and are not without risk. CAST has benefited greatly from CERN’s world-class cryogenic expertise in this respect, with its reliable user-friendly gas system designed by Tapio Niinikoski and his PhD student Nuno Elias. At present an extensive thermodynamic simulation is being performed with the aim of reconstructing the changing conditions of the helium gas as the magnet tracks the Sun. For example, to achieve the homogeneity in gas density necessary to keep coherence, the temperature variations along the 9-m long pipes should be in the milli-kelvin range; this is made possible by the surrounding bath of superfluid liquid helium at about 1.8 K.

CAST is also a “special” experiment when compared with others because its highly sensitive magnet and low-background detectors must operate while in motion, even though the speed of about 2 m an hour is almost imperceptible. In addition, CAST’s equipment must withstand quenches of the superconducting magnet. After each quench the gas control system must cope with extreme conditions within seconds. However, during 15,000 hours of operation with the magnet on, and more than 2000 hours of solar tracking, CAST has survived potentially catastrophic events because its safety features have – thanks to the careful work of CERN’s Martyn Davenport – never failed simultaneously.

Scientific return


While CAST has so far failed to find direct evidence for solar axions, it has been able to provide robust new limits on the interaction of solar axions with a magnetic field, i.e. the sea of virtual photons (figure 1). Its experimentally derived limit dominates the relevant phase space and competes with the best astrophysically derived lower value for the coupling constant, g. CAST is now moving into a theoretically motivated region, having almost fulfilled the original expectations set a decade ago with all of the input uncertainties at that time.

Moving beyond the initial proposal, CAST has in parallel explored – for the first time for a solar axion search – the region of high-energy solar axions, following the proposal of collaboration member Juan Collar. It has also made the first measurements below 1 keV, covering so far the range of around 1–3 eV. Moving to energies above this is possible; however, it will require larger energy steps and some new state-of-the-art detector technology to explore this interesting energy region, which covers most of the Sun’s puzzling X-ray activity.

With no solar-axion signature detected so far, the question arises: what is the scientific return from CAST? Certainly, the first benefit is educational, with students completing some 10 PhD theses and an equal number of diploma theses. There have also been several CAST summer students at CERN. On the research side, CAST has helped to revive axion activities around the world, fitting between pure axion searches in the laboratory and a variety of astrophysical/cosmological observatories that usually did not have axions in their original list of objectives. The state-of-the-art detectors in these observatories cover photon energies from micro-electron-volts upwards. With CAST, the implementation of X-ray optics in axion helioscopy has become widely accepted as a necessary ingredient for future scaled-up versions.

While CAST’s results have become a reference in the relevant field, they have also been used by other teams to search, for example, for “paraphotons” – sterile massive photons from the “hidden sector”. Furthermore, two members of the CAST collaboration, Milica Krčmar and Biljana Lakić, have used the experiment’s results to explore theories of large extra dimensions, which predict “massive” axions of the Kaluza-Klein type. Interestingly, such massive exotica could be gravitationally trapped in the Sun and could build a bright halo, as a result of their spontaneous decay, as we have suggested with Luigi Di Lella of CERN.

The axion signal that the CAST collaboration aims to observe while tracking the Sun consists of excess X-rays emerging from the magnet tubes. Interestingly, there is abundant solar X-ray emission of otherwise unknown origin, which is further enhanced just above the magnetized photosphere. For more than 70 years, known physics has failed to explain this intriguing behaviour, which could, however, arise from the conversion or decay of axions or other similar exotica near the Sun’s restless surface. The outermost solar layers, i.e. the photosphere, might act occasionally as scaled-up and highly effective catalysts of axions or similar particles, emitting large numbers of X-rays (like a fine-tuned CAST might do one day). Then, extending Sikivie’s original idea, the otherwise mysterious solar surface makes these axions visible as X-rays. New X-ray observatories in space are already providing more and more exciting evidence that something new and interesting is going on in the Sun’s outer layers. The complete axion scheme may make the Sun even more special than it already is.

Such a solar scenario might eventually point to a “superCAST”, which in 5 to 10 years may well make the present CAST look like an old-fashioned miniature device – provided that Sikivie’s pioneering idea behind CAST is not replaced by a novel conceptual design. For example, together with Andrzej Siemko of CERN we have proposed using a quadrupole magnet as a potentially better axion catalyst than the dipole magnets used at present in almost all axion experiments. This idea, which was also discussed theoretically by Eduardo Guendelman in 2008, is motivated observationally, because otherwise puzzling solar X-ray activity correlates not only with magnetic fields but even more with places of varying field vector.

Alvaro De Rújula commented in 1998 that “axion searches are mandatory, fun, creative – and proceeding”. His words are just as true today, as the CAST project continues into its second decade.

• I am very grateful to all members of the CAST collaboration, to CERN for its hospitality and support, including the librarians, and to my colleagues at the University of Patras for their real help.

This article is dedicated to the memory of the following members of the CAST collaboration who have sadly passed away since the project’s inception: Engin Abat, Engin Arik, Fatma Senel Boydag, Ozgen Berkol Dogan, Angel Morales and Julio Morales.

Murray Gell-Mann: my contemporary and friend


Murray Gell-Mann and I were born a few days apart in September 1929. Being born on almost the same date as a genius does not help much, except for the fact that by having the same age there was a non-zero probability that we would meet. And indeed this is what happened; furthermore, we and our families became friends. Because I was unable to attend the meetings in honour of Murray, I am making this testimony on the occasion of his 80th birthday.

Murray’s family was much affected by the crash of October 1929. His father had to change jobs completely. If this had not happened, it is possible that Murray might have become a successful businessman instead of a brilliant physicist. Everybody knows that Murray is immensely cultured and has multiple interests. I can quote a few at random: penguins, other birds (tichodromes for instance), Swahili, Creole, Franco-Provençal (and more generally the history of languages), pre-Columbian art and American-Indian art, gastronomy (including French wines and medieval food), the history of religions, climatic change and its consequences, energy resources, protection of the environment, complexity, cosmology and the quantum theory of measurement. However, it is in the field of theoretical particle physics that he made his most creative and important contributions. For these, I personally consider him to be the best particle-physics theoretician alive today.

Bright beginnings

I met Murray for the first time at Les Houches in 1952, one year after the foundation of the school by Cecile Morette-DeWitt. It was immediately obvious that he was extremely bright. Then he was invited by Maurice Levy to the Ecole Normale and gave some lectures at the Institut Henri Poincaré. He gave these in French, which had an amusing consequence as a result of a practical joke by Maurice. For months, as they worked together, speaking French, whenever Murray had said something like “ces deux termes s’annulent” (these two terms cancel) Maurice repeated it, substituting “se chancellent” for “s’annulent.” Now Murray knew that “chanceler” means to wobble and not to cancel, but he finally supposed that in English-influenced French scientific jargon, “chanceler” could mean “to cancel.” Otherwise, why would Maurice keep using that word? When Murray actually employed the word in one of his lectures, Maurice went into paroxysms of laughter.

In 1955 I attended my first physics conference, in Pisa. After a breakfast with Erwin Schroedinger, I took the tram and met Murray. In the afternoon, at the University of Pisa, he made the first public presentation of the strangeness scheme. The auditorium was packed. I was completely bewildered by this extraordinary achievement, with its incredible predictive power (which was very soon checked) including the KK system. I had already left Ecole Normale-Orsay for CERN when he and Maurice wrote their famous paper featuring for the first time what was later called the “Cabibbo angle”.

I then had the luck to be sent to the La Jolla conference in 1961. There I met Nick Khuri for the first time, who became a close friend, and I heard Murray presenting “the Eightfold Way” (i.e. the SU(3) octet model). Also attending were Marcel Froissart, who derived the “Froissart Bound”, and Geoff Chew, who presented his version of the S-matrix programme. Both were most inspiring for my future work. What I did not realize at the time was that the Chew programme had been largely anticipated by Murray, who first was involved in the use of dispersion relations and then noticed, in 1956, that the combination of analyticity, unitarity and crossing symmetry could lead to field theory on the mass shell, with some interesting consequences (as exemplified by Froissart’s work and by my later work on the subject).

In 1962, during the Geneva “Rochester” conference, I was again present when Murray, after a review of hadron spectroscopy by George Snow, stood up and pointed out that the sequence of particles Δ, Σ*, Ξ* could be completed by a particle that he called Ω to form a decuplet in the SU(3) scheme. He predicted its mode of production, its decay, which was to be weak, and its mass. This was followed by a period of deep scepticism among theoreticians, including some of the best. However, at the end of 1963, while I was in Princeton, Nick Samios and his group at Brookhaven announced that the Ω had been discovered, with exactly the correct mass within a few mega-electron-volts. Frank Yang, one of the sceptics, called it “the most important experiment in particle physics in recent years”. I missed the invention of the quarks, being in Princeton, far from Caltech, where Murray was, and from CERN where George Zweig was visiting. I met Bob Serber but I was completely unaware of his catalytic role in that discovery.

Close friends

My next important meeting with Murray was in Yerevan in Armenia in 1965, where Soviet physicists had invited a group of some eight western physicists. This time Murray came with his whole family: his wife, Margaret – a British archaeology student whom he met in Princeton – and his children, Lisa and Nick. During the following summer, which the Gell-Manns spent in Geneva, our families met several times. I remember once when my children, seeing a portrait of Lisa by the famous Armenian painter H Galentz, said: “This is a green Lisa.” The Gell-Manns spent another year at CERN before Harold Fritzsch, Gell-Mann, and Heinrich Leutwyler wrote the “Credo” of QCD.

Margaret and Murray came to Geneva again for the academic year 1979/80. They were living in an apartment in the same group of buildings as us. Schu, my wife, then became a close friend of Margaret, who was a typically British girl: very reserved, very intelligent and possessing a good sense of humour. One example of her modesty is that, while we knew she had been digging at Mycenae for an archaeologist named Alan Wace, we found out only long after her death that she had played a personal role in destroying a theory of Sir Arthur Evans, who claimed wrongly that the Cretans had dominated the Mycenaeans during a certain part of the late-Minoan period – while the reverse was true. In fact, she was the first to discover a Linear B tablet at Mycenae. Although Carl Blegen had found Linear B tablets at Pylos long before, finding them at Mycenae as well was important additional evidence, once Michael Ventris had proved that the language of Linear B was an early form of Greek, that Margaret’s boss was right. He had suffered terribly from his refusal to agree with Evans.

An extraordinary friendship grew up between Margaret and Schu. When the Gell-Manns left Geneva for Pasadena, Margaret knew that there was something wrong with her health. Back in the US she discovered that she had cancer. I do not know the number of transatlantic trips that we made – sometimes both of us, sometimes Schu alone – to help Margaret. This included stays in Aspen during the summers of 1980 and 1981. In between, Schu and Margaret had an extensive correspondence. Schu decided to initiate Margaret into French poetry. In particular, she sent Margaret poems by Jacques Prévert and Paul Eluard. On Margaret’s grave, in Aspen, Murray put the inscription: “Mais ou en est ce léger sourire” (Eluard, about Nuesch, his late wife). After Margaret’s death, we all kept in touch because Murray has one remarkable quality: faithfulness in friendship.

• I am grateful to my wife, Schu, and to Murray for suggestions and corrections.

Romania takes first steps to join CERN


On 11 February the Romanian minister of education, research, youth and sport, Daniel Petru Funeriu, and CERN’s director-general, Rolf Heuer, signed an agreement that formally recognizes Romania as a candidate for accession to membership of CERN.

Romania’s pre-membership will cover a five-year period during which the country’s contributions will increase to normal member-state levels, in parallel with Romania’s participation in CERN projects. At the end of this five-year period CERN Council will decide on Romania’s application for full membership, as the organization’s 21st member state.

Romania entered into direct collaboration with CERN in the early 1990s. In recent years the country has constantly increased its expenditure on R&D, in particular since the country’s accession to the EU in January 2007. Romania is involved in three LHC experiments: ATLAS, ALICE and LHCb. It also contributes to the DIRAC and ISOLDE programmes and to Grid computing.
