

Nara workshop looks at heavy quarkonia

The 6th International Workshop on Heavy Quarkonia took place in Nara in December 2008, attracting some 100 participants. It was the latest in a series organized by the Quarkonium Working Group (QWG), a collaboration of theorists and experimentalists particularly interested in the physics of quarkonia – bound states of heavy quark–antiquark pairs (Brambilla et al. 2004). The talks, together with three round-table discussions, emphasized the latest advances in the understanding of quarkonium production, the discovery of the ηb, the properties of the narrow X, Y and Z resonances, and the use of quarkonium states as probes of the QCD matter formed in high-energy nuclear collisions. The meeting ended with a series of talks on how the Antiproton Annihilations at Darmstadt (PANDA) experiment and the LHC experiments should improve and complement present knowledge.

New states


The nature and properties of the narrow X, Y and Z resonances, recently discovered at the B-factories (and thought to be quarkonium states), were extensively discussed at the workshop. Presentations from the Belle, BaBar and CDF collaborations provided new information on the masses, branching ratios, quantum numbers and production properties of these particles. Using approximately 6000 signal events in X(3872) → J/ψπ+π– decays, CDF obtained the most precise determination of the X(3872) mass: 3871.61±0.16(stat.)±0.19(syst.) MeV/c2, a value extremely close to the D0D*0 mass threshold, 3871.8±0.36 MeV/c2 (figure 1). Given present uncertainties, the interpretation of the X(3872) as a “molecular” D0D*0 bound state remains possible but not compulsory. In addition, the CDF collaboration reported a very accurate mass measurement for the Bc± of 6275.6±2.9(stat.)±2.5(syst.) MeV/c2, obtained by studying the mass spectrum of Bc+ → J/ψπ+ decays (and the charge conjugates). The CDF and D0 experiments also measured the Bc lifetime through the study of semileptonic decays. The measurements are of comparable precision, leading to a world-average lifetime of 0.459±0.037 ps for the only observed charged quarkonium.
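Whether the X(3872) lies above or below the D0D*0 threshold comes down to combining the quoted uncertainties. A quick numerical check, combining the errors in quadrature on the assumption that they are independent, shows the mass difference is consistent with zero:

```python
import math

def combine(*errs):
    """Combine independent uncertainties in quadrature."""
    return math.sqrt(sum(e * e for e in errs))

m_x = 3871.61                  # X(3872) mass, MeV/c^2 (CDF)
err_x = combine(0.16, 0.19)    # stat. and syst. errors
m_thr = 3871.8                 # D0 D*0 mass threshold, MeV/c^2
err_thr = 0.36

diff = m_x - m_thr
err_diff = combine(err_x, err_thr)

# The measured mass sits well within one combined standard deviation
# of the threshold, which is why a loosely bound "molecular" state
# can be neither confirmed nor excluded.
print(f"X(3872) - threshold = {diff:.2f} +/- {err_diff:.2f} MeV/c^2")
```

With a combined uncertainty of about 0.44 MeV/c2, the measurement cannot yet decide on which side of the threshold the state lies.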


Another hot topic was the BaBar experiment’s discovery of the long-sought bottomonium ground state, the ηb. On the basis of record event samples collected early in 2008 (more than two hundred million Υ(2S) and Υ(3S) events), the BaBar collaboration announced in July the observation of the ηb in the rare magnetic-dipole transition Υ(3S) → γηb. At the Nara workshop, BaBar showed preliminary evidence for the Υ(2S) → γηb decay, which confirms the earlier observation (figure 2). The measured mass for the ηb is 71.4 (stat.) ±2.7(syst.) MeV smaller than the Υ(1S) mass. This mass difference is almost twice the value calculated in perturbative QCD, 39±11(theor.) (δαs) MeV, challenging the expectation that non-perturbative corrections should amount to only a few MeV.

The Belle collaboration reported an improved measurement of the inclusive cross-section for the production of a J/ψ meson plus additional charmed particles. The new result is around 15% lower than their previous value and, in combination with a new calculation of next-to-leading-order (NLO) corrections, brings theory and experiment into reasonable agreement – albeit with large uncertainties – potentially solving a long-standing quarkonium-production puzzle.

The workshop also heard about new calculations of NLO corrections to the colour-singlet quarkonium-production mechanism, which confirm that the ratio between the colour-singlet and colour-octet production rates is larger than previously thought. The same calculations predict that J/ψs produced via the colour-singlet mechanism should exhibit a stronger longitudinal polarization in the helicity frame than is observed in the data from CDF. In the case of J/ψ photoproduction, new NLO calculations of the colour-singlet contribution fail to reproduce the polarization measurements made at HERA. If it turns out that feed-down effects do not modify the observed polarizations significantly then these discrepancies might indicate that a colour-octet contribution is required to bring the polarization predictions and experiment into agreement. There was also a discussion on the consistency of the measurements of the J/ψ polarization by the E866, HERA-B and CDF experiments. The seemingly contradictory data sets are surprisingly well reproduced if one models the polarization along the direction of relative motion of the colliding partons by assuming that, for directly produced J/ψs, it changes continuously from fully longitudinal at low total momentum to fully transverse at asymptotically high total momentum.

Heavy ions

Another interesting line of research in the QWG’s activities concerns the use of heavy-quarkonium states as particularly informative probes of the high-density QCD matter produced in high-energy heavy-ion collisions. Contrary to early expectations, however, currently available J/ψ-suppression measurements cannot be seen as “smoking-gun signatures” that would show beyond reasonable doubt the creation of a deconfined state of quarks and gluons. Indeed, the present experimental picture is blurred by several “cold nuclear matter” effects, including shadowing of the parton densities, final-state nuclear absorption of fully formed charmonium states (or of pre-resonances) and initial-state parton-energy loss. Furthermore, the feed-down contributions to the J/ψ production yields (and their own “melting” patterns) need careful evaluation. Presentations in Nara showed recent progress in the understanding of these topics, and there were detailed discussions of quarkonium properties in finite-temperature QCD. Future measurements of the Υ family at the LHC should open a better window onto this interesting landscape.


The next International Workshop on Heavy Quarkonia will take place at Fermilab in May 2010. Meanwhile, quarkonium aficionados are eagerly awaiting the first results from the LHC. More than 30 years after the charmonium and bottomonium families were served as revolutionary entrées, quarkonium physics remains high on the menus of many physicists, providing a table d’hôte at which to test the properties of perturbative and non-perturbative QCD and to validate the continually improving computational tools. Sprinkled with enough puzzles to spice up the meal, quarkonium physics will continue to please the most discerning appetites for years to come.

ATLAS makes a smooth changeover at the top


If you think that it might be time to retire after more than 15 years of leading a constantly growing international collaboration and of constructing the world’s largest-volume particle detector, then Peter Jenni would disagree. Nicknamed the “father of ATLAS” by his colleagues, Jenni was there in 1992 when the ATLAS collaboration was born out of two early proto-collaborations. Initially co-spokesperson, he was spokesperson from 1995 until March 2009, when he handed over to Fabiola Gianotti. Now he looks forward to getting back to the main purpose of ATLAS: the physics.

“I am very proud to have helped the collaboration to construct ATLAS. Twenty years ago we could only imagine the experiment in our dreams and now it exists,” says Jenni. “I could lead the collaboration for so long because I was supported by very good ATLAS management teams where the right people, such as Fabiola Gianotti, Steinar Stapnes, Marzio Nessi and Markus Nordberg over the past five years, were in the right places.”

As with most particle-physics experiments, the management of one of the two largest detectors at the LHC is a challenge that changes during the lifetime of the collaboration: it starts with the design phase, continues with the R&D and the construction and ends up with the data-taking and analysis. “Over the years I tried to balance the emphasis given by the collaboration to the different aspects, that is, the hardware part (initially very strong), the data preparation, computing and software,” confirms Jenni.

Originally “only” about 800-strong, the ATLAS collaboration today has almost 3000 members from all over the world. “Keeping the groups united, inviting new groups to join the collaboration, negotiating to find the funds necessary for the construction… these have been among my key tasks during the past 15 years,” he explains. “My efforts also went into keeping groups whose technologies were not retained in the collaboration. Most of the time we managed to have everyone accept the best arguments, but unfortunately there were a few exceptions.”

With such a vast amount of experience, what does Jenni regard as the key element for managing a successful collaboration? “Talking with as many people as possible is a key factor,” he says. “ATLAS members, even the youngest ones, knew that I was available to discuss all problems or issues at any time. With the exception of the Christmas period, I have tried to reply to all e-mails within 24 hours. By the way, that is why my son thinks physics is crazy and decided to study microtechnologies instead!”

While Jenni’s functions have changed, his engagement with ATLAS definitely has not. “A significant part of my work remains the same, particularly in the relationships of ATLAS with the outside world. My main duty is to help obtain a smooth transition, which is facilitated by the fact that Fabiola was one of my two deputies – and I have enjoyed working with her before.” Indeed, having more freedom now, he can think of doing more than just sharing some management duties. “In the medium term I have the ambition to study physics with ATLAS,” he says. “I am already ‘selling’ LHC physics in many public talks but I would like to contribute some real physics myself.”

The ATLAS collaboration is clearly appreciative of its father’s dedication over the years. At the party organized in Jenni’s honour on 19 February, the Collaboration Board (CB) chairs, directed by Katie McAlpine – the author and singer of the LHC rap – sang: “We’ve been CB chairs / and we’re here to affirm / Peter’s time was more an era / than just a few terms / leading ATLAS to completion / like no one else can / Of course he did it / Jenni is the man.”

The changeover

Now, with the construction complete, it’s Gianotti’s turn to fill the spokesperson’s many shoes, after Jenni passed her the leadership baton in March. She was there from the very beginning, joining the LHC R&D activities and then, in 1990, the proto-ATLAS collaboration. “Heading such an ambitious scientific project, and a large and geographically distributed collaboration, is certainly a big honour, responsibility and challenge,” she says. “However, I have inherited a very healthy situation from Peter: the experiment has already shown that it performs well, the collaboration is united and strong, and we can continue to prepare for the first collisions without any major worry.”

Indeed, activity on ATLAS hasn’t stopped since the LHC incident on 19 September 2008. “The first single beams that circulated in the machine before the incident were very useful for studying several aspects of the experiment, such as the timing of the trigger system. After the LHC stopped, we decided to focus on some repairs to the detector and on the optimization of the software and computing infrastructure, of the data distribution chain, and of the event simulation and reconstruction,” confirms Gianotti.

An effective distribution of data to the worldwide community is a key point for the new ATLAS spokesperson because she thinks that this is the prime requisite for a motivated and successful collaboration. “The crucial challenge for me is to make sure that each single member of ATLAS can participate effectively and successfully in the adventure that this experiment represents. ATLAS has a very exciting future ahead, with many possible discoveries that will change the landscape of high-energy physics. I consider it very important that each individual in this experiment can actively participate in the data analysis, regardless of whether he or she can physically be at CERN or not. In particular, we have to make sure the younger generations are nurtured in a stimulating environment, share the excitement for the wonderful physics opportunities and are given visibility and recognition,” she explains.

While the sharing of data relies mostly on the performance of the Grid and the software and computing infrastructure put in place by the collaboration, it cannot occur without the other side of the coin – effective and open communication in real-time with all members of the collaboration. “The solution we have envisaged is a web space where ATLAS people will be able to find updated ‘on-line’ news about the machine, the experiment, the physics results, anything that is relevant to ATLAS’ life,” explains Gianotti.

Asked about the potential “competition” among many people working on the same analysis, she says: “I think it is healthy that people from different groups work on the same topic with a collaborative and constructive spirit. This will allow us to produce solid, verified and fully understood results.” Regarding the relationship with CMS, the other general-purpose LHC experiment, she says, “There is a healthy competition, but also collaboration. For instance, ATLAS and CMS have set up a common group that works on statistics tools and how to combine the information coming from both experiments.”

The excitement about the restart of the LHC is growing again at CERN and around the world, and the experiments all have their own plans and strategies. “Before undertaking the path towards discoveries, we will need to understand the performance of our detector in all details and ‘rediscover’ the Standard Model,” says Gianotti. “I believe that we will be ready to start investigating new territories when we have observed top-quark production. Indeed, final states arising from the production of top quark–antiquark pairs contain most of the interesting physics objects, from leptons to missing energy and light- and heavy-flavour jets. In addition, this process is the main background to many searches for new physics. Being able to reconstruct these events successfully, and perform our first measurements of the top production cross-section and mass, will give us a clear indication that we are ready for discoveries.”

When does Gianotti expect ATLAS to release the first results? “It all depends on the performance of the machine – and its luminosity and energy profile. If everything goes well we expect to have first results, mainly addressing the detector performance, for the winter physics conferences early in 2010; then we hope to present the first interesting physics results at the summer conferences of the same year.”

The age of citizen cyberscience

I first met Rytis Slatkevicius in 2006, when he was 18. At the time, he had assembled the world’s largest database of prime numbers. He had done this by harnessing the spare processing power of computers belonging to thousands of prime-number enthusiasts, using the internet.


Today, Rytis is a mild-mannered MBA student by day and an avid prime-number sleuth by night. His project, called PrimeGrid, is tackling a host of numerical challenges, such as finding the longest arithmetic progression of prime numbers (the current record is 25). Professional mathematicians now eagerly collaborate with Rytis, to analyse the gems that his volunteers dig up. Yet he funds his project by selling PrimeGrid mugs and t-shirts. In short, Rytis and his online volunteers are a web-enabled version of a venerable tradition: they are citizen scientists.
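PrimeGrid’s record-setting searches run on dedicated distributed software, but the underlying task can be illustrated with a toy brute-force search over a tiny range. The function below is purely illustrative; real searches use far more efficient sieving and span enormous ranges:

```python
def is_prime(n):
    """Trial-division primality test, adequate for small n."""
    if n < 2:
        return False
    for d in range(2, int(n ** 0.5) + 1):
        if n % d == 0:
            return False
    return True

def longest_prime_ap(limit):
    """Brute-force search for the longest arithmetic progression
    of primes whose terms all lie below `limit`."""
    primes = [p for p in range(2, limit) if is_prime(p)]
    prime_set = set(primes)
    best = []
    for start in primes:
        for step in range(1, limit):
            run = [start]
            while run[-1] + step < limit and (run[-1] + step) in prime_set:
                run.append(run[-1] + step)
            if len(run) > len(best):
                best = run
    return best

# Below 50, the best is the classic five-term progression 5, 11, 17, 23, 29.
print(longest_prime_ap(50))
```

The record-25 progression mentioned above lives many orders of magnitude beyond what trial division can reach, which is exactly why volunteer computing is needed.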

There are nearly 100 science projects using such volunteer computing. Like PrimeGrid, most are based on an open-source software platform called BOINC. Many address topical themes, such as modelling climate change (ClimatePrediction.net), developing drugs for AIDS (FightAids@home), or simulating the spread of malaria (MalariaControl.net).

Fundamental science projects are also well represented. Einstein@Home analyses data from gravitational-wave detectors, MilkyWay@Home simulates galactic evolution, and LHC@home studies accelerator beam dynamics. Each of these projects has easily attracted tens of thousands of volunteers.

Just what motivates people to participate in projects like these? One reason is community. BOINC provides enthusiastic volunteers with message boards to chat with each other, and share information about the science behind the project. This is strikingly similar to the sort of social networking that happens on websites such as Facebook, but with a scientific twist.

Another incentive is BOINC’s credit system, which measures how much processing each volunteer has done – turning the project into an online game where they can compete as individuals or in teams. Again, there are obvious analogies with popular online games such as Second Life.
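The credit bookkeeping behind such leaderboards is conceptually simple. A minimal sketch follows, with made-up volunteers, teams and credit values; BOINC’s real accounting validates each work unit before granting credit:

```python
from collections import defaultdict

# Hypothetical validated work reports: (volunteer, team, credit granted).
# All names and numbers here are invented for illustration.
reports = [
    ("alice", "TeamPrime", 120.0),
    ("bob",   "TeamPrime",  80.0),
    ("carol", "Solo",      150.0),
    ("alice", "TeamPrime",  60.0),
]

user_credit = defaultdict(float)
team_credit = defaultdict(float)
for user, team, credit in reports:
    user_credit[user] += credit
    team_credit[team] += credit

# Rank individuals and teams by total credit, as on a project leaderboard.
top_users = sorted(user_credit.items(), key=lambda kv: -kv[1])
top_teams = sorted(team_credit.items(), key=lambda kv: -kv[1])
print(top_users[0])   # ('alice', 180.0)
print(top_teams[0])   # ('TeamPrime', 260.0)
```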

Brains vs processors

A new wave of online science projects, which can be described as volunteer thinking, takes the idea of participative science to a higher level. A popular example is the project GalaxyZoo, where volunteers classify images of galaxies from the Sloan Digital Sky Survey as either elliptical or spiral, via a simple web interface. In a matter of months, some 100,000 volunteers classified more than 1 million galaxies. People do this sort of pattern recognition more accurately than any computer algorithm. And when many volunteers classify the same image, their statistical consensus proves to be more accurate than even a professional astronomer’s judgement.
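The consensus idea can be sketched in a few lines of code. The vote threshold and minimum-vote cut used here are illustrative choices, not GalaxyZoo’s actual criteria:

```python
from collections import Counter

def consensus(votes, min_votes=5, threshold=0.8):
    """Return the consensus label for one image, or None if the
    volunteers do not agree strongly enough."""
    if len(votes) < min_votes:
        return None                     # too few classifications yet
    label, count = Counter(votes).most_common(1)[0]
    return label if count / len(votes) >= threshold else None

# Ten hypothetical classifications of the same galaxy image.
votes = ["spiral"] * 9 + ["elliptical"]
print(consensus(votes))  # spiral
```

Images on which volunteers split evenly return no consensus; in practice such ambiguous objects are often the scientifically interesting ones.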

When I mentioned this project to a seasoned high-energy physicist, he remarked wistfully, “Ah, yes, reminds me of the scanning girls”. High-energy physics data analysis used to involve teams of young women manually analysing particle tracks. But these were salaried workers who required office space. Volunteer thinking expands this kind of assistance to millions of enthusiasts on the web at no cost.

Going one step farther in interactivity, the project Foldit is an online game that scores a player’s ability to fold a protein molecule into a minimal-energy structure. Through a nifty web interface, players can shake, wiggle and stretch different parts of the molecule. Again, people are often much faster at this task than computers, because of their aptitude for reasoning in three dimensions. And the best protein folders are usually teenage gaming enthusiasts rather than trained biochemists.

Who can benefit from this web-based boom in citizen science? In my view, scientists in the developing world stand to gain most by effectively plugging in to philanthropic resources: the computers and brains of supportive citizens, primarily those in industrialized countries with the necessary equipment and leisure time. A project called Africa@home, which I’ve been involved in, has trained dozens of African scientists to use BOINC. Some are already developing new volunteer-thinking projects, and a first African BOINC server is running at the University of Cape Town.

A new initiative called Asia@home was launched last month with a workshop at Academia Sinica in Taipei and a seminar at the Institute of High Energy Physics in Beijing, to drum up interest in that region. Asia represents an enormous potential, in terms of both the numbers of people with internet access (more Chinese are now online than Americans) and the high levels of education and interest in science.

To encourage such initiatives further, CERN, the United Nations Institute for Training and Research and the University of Geneva are planning to establish a Citizen Cyberscience Centre. This will help disseminate volunteer computing in the developing world and encourage new technical approaches. For example, as mobile phones become more powerful they, too, can surely be harnessed. There are about one billion internet connections on the planet and three billion mobile phones. That represents a huge opportunity for citizen science.

MAGIC becomes twice as good


Since the Major Atmospheric Gamma Imaging Cherenkov (MAGIC) telescope began operation in 2003 it has made a host of discoveries about sources of high-energy cosmic gamma rays. This is in large part thanks to having the largest reflector in the world, a tessellated mirror with an area of 240 m2. Now the MAGIC-I telescope is being joined by a sibling, MAGIC-II, which has similar characteristics together with several improvements.

Cosmic gamma rays are important because they serve as direct messengers of distant events, unaffected by magnetic fields. Studies usually rely on satellite experiments, while ground-based Cherenkov telescopes are mainly used for the highest energies. The latter make use of the fact that charged secondary particles, generated in electromagnetic showers in the atmosphere, may emit Cherenkov radiation – photons with energies that are in the visible and UV ranges. Such photons pass through the atmosphere and are observable on the surface of the Earth using sufficiently sensitive instruments.
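The geometry of the emitted light follows from the Cherenkov condition cos θ = 1/(nβ), and the emission threshold from β > 1/n. A back-of-the-envelope sketch for air (the refractive index is an approximate sea-level value):

```python
import math

def cherenkov_angle_deg(n, beta=1.0):
    """Cherenkov emission angle from cos(theta) = 1/(n*beta)."""
    return math.degrees(math.acos(1.0 / (n * beta)))

def threshold_gamma(n):
    """Minimum Lorentz factor for Cherenkov emission (beta > 1/n)."""
    return 1.0 / math.sqrt(1.0 - 1.0 / n**2)

n_air = 1.0003   # approximate refractive index of air near sea level
m_e = 0.511      # electron mass, MeV/c^2

angle = cherenkov_angle_deg(n_air)
e_thr = threshold_gamma(n_air) * m_e
print(f"emission angle ~ {angle:.2f} deg")       # ~1.4 deg
print(f"electron threshold ~ {e_thr:.0f} MeV")   # ~21 MeV
```

The tiny emission angle is why a single shower illuminates a light pool of roughly constant radius on the ground, which a large reflector like MAGIC’s can sample.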

The MAGIC telescopes on the Canary Island of La Palma are specifically designed to detect Cherenkov radiation resulting from electromagnetic air showers generated by cosmic gamma rays. MAGIC-I was built with emphasis on the best light collection, making gamma rays accessible down to an energy threshold of 25 GeV, which is lower than for any other existing ground-based gamma-ray detector. It now provides an ideal overlap with the Large Area Telescope on the recently launched Fermi Gamma-Ray Space Telescope, which has an energy range from 20 MeV up to 300 GeV.

The new MAGIC-II telescope has the same reflector size as MAGIC-I, although it is made with larger mirrors. Its camera has been modified to increase spatial resolution and sensitivity by hosting a larger number of photomultipliers of higher efficiency. It also incorporates a new read-out system. The new system’s gain in sensitivity will be between a factor of two and a factor of three compared with MAGIC-I, depending on energy. A lower energy threshold means a higher sensitivity to special phenomena – and a deeper view into the universe. It was characteristics such as these that in 2008 enabled MAGIC-I to discover the most distant very-high-energy gamma sources, including the blazar 3C279, at a record distance of 6000 million light-years – an observation that questions theories on the transparency of the universe to gamma rays (MAGIC Collaboration 2008a). In addition, MAGIC-I revealed pulsed gamma rays above 25 GeV emanating from the Crab pulsar; this is the highest-energy pulsed emission so far detected (MAGIC Collaboration 2008b).

The new MAGIC-II telescope will couple a low threshold with higher sensitivity and resolution, as well as improve further the view of the highest energy phenomena in the universe. A “first-light” ceremony is to take place on 24–25 April at the site on La Palma.

LHC consolidation work proceeds apace

The consolidation campaign for the LHC, which aims to ensure a safe final commissioning and reliable running of the collider, is now well under way. On 9 February CERN’s management confirmed the restart schedule for the LHC resulting from the recommendations of the previous week’s Chamonix workshop.

After the incident in September 2008, magnets were immediately prepared to replace those damaged. Now consolidation work is under way to ensure a safe and reliable restart of the LHC later this year, so that the experiments have adequate data to carry out their first new-physics analyses and results to announce in 2010. The new schedule also permits the possibility of lead–ion collisions in 2010.

In Chamonix there was consensus among all the technical specialists that the new schedule is tight but realistic. According to CERN’s director-general, Rolf Heuer, “The schedule we have now is without a doubt the best for the LHC and for the physicists waiting for data. It is cautious, ensuring that all the necessary work is done on the LHC before we start up, yet it allows physics research to begin this year.” This new schedule represents a delay of six weeks with respect to the previous schedule, which foresaw the LHC “cold at the beginning of July”. This delay arises from several factors such as the implementation of a new enhanced protection system for the busbar and magnet splices; installation of new pressure-relief valves to reduce the collateral damage in case of a repeat incident; application of more stringent safety constraints; and scheduling constraints associated with helium transfer and storage. The new pressure-relief system has been designed in two phases. The first phase involves the installation of relief valves on existing vacuum ports in the whole ring.

Calculations have shown that in an incident similar to that of 19 September 2008 – which damaged magnets in sector 3-4 – the collateral damage would be minor with this first phase. The second phase involves adding further relief valves on all of the dipole magnets, which would guarantee minor collateral damage (to the interconnects and super-insulation) in all worst cases over the life of the LHC. For 2009, the management has decided to install the additional relief valves on four of the LHC’s eight sectors, concurrently with the repairs in sector 3-4 and other consolidation work already foreseen. The dipoles in the remaining four sectors will be equipped in 2010.

On 18 February, Steve Myers, Director for Accelerators and Technology, reviewed the discussions on the LHC that took place at Chamonix at the public session of the LHC experiments committee. In particular, he described the scenarios that were studied to implement the consolidation measures and resume operation. He also explained that the schedule ultimately adopted will make it possible to obtain more physics data sooner, even though the energy will be limited during this first period to 5 TeV per beam to ensure completely safe operation. During the last week of February the enhanced quench-protection system had a full review from a panel of experts from other high-energy physics laboratories around the world, including the Brookhaven National Laboratory, DESY, Fermilab and the international fusion project, ITER. The enhanced protection system measures the electrical resistance in the cable joints (splices) and is much more sensitive than the system existing on 19 September.

The system has two separate parts: one to detect and protect against splices with abnormally high resistance; the second to detect a symmetric quench. The planning schedule was reviewed to define priorities between these two parts, both of which need to be complete before the restart at the end of September. The review also covered areas such as the technical details of the implementation of the new system, how well it will perform during operation and how “robust” it will be after years of service. In the preliminary report the panel found that: “The machine-protection staff have demonstrated a deep understanding of the issues involved in the design of the high-resistance-splice detection system.” It has “full confidence that the new system will have the ability to give early warnings for suspicious splices measured at the level of 1 nΩ” and that “the symmetric quench protection system, once its design is complete, will be able to detect quenches at twice the normal detection level”.
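The quoted 1 nΩ sensitivity translates into very small voltage signals at operating current. A rough order-of-magnitude sketch follows; the current values are assumptions (roughly 8500 A for 5 TeV operation and 11 850 A for the nominal 7 TeV), not figures from the review:

```python
def splice_voltage(current_amps, resistance_nohm):
    """Voltage drop across a splice, V = I * R, with R given in nanoohms."""
    return current_amps * resistance_nohm * 1e-9

# Assumed operating currents (see lead-in); 1 nOhm excess resistance.
for current in (8500.0, 11850.0):
    v = splice_voltage(current, 1.0)
    print(f"{current:>7.0f} A, 1 nOhm -> {v * 1e6:.1f} microvolts")
```

Resolving microvolt-level drops across superconducting joints carrying thousands of amps is what makes the new system so much more sensitive than its predecessor.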

PAMELA pins down cosmic antiproton flux


The satellite experiment Payload for Antimatter-Matter Exploration and Light-nuclei Astrophysics (PAMELA) has made a new measurement of the antiproton-to-proton flux ratio in cosmic rays with energies up to 100 GeV. The results, which represent a great improvement in statistics compared with data published previously, provide significant constraints on exotic sources of cosmic antimatter.

The PAMELA experiment has been in low Earth orbit on the Resurs-DK1 satellite since its launch in June 2006. During 500 days of data collection it has identified 1000 antiprotons with energies in the range 1–100 GeV, including 100 antiprotons with an energy above 20 GeV. This is a larger data sample at higher energies than any other experiment has obtained.

Cosmic antiprotons can be made in particle (mainly proton) collisions with interstellar gas but they could also have more exotic origins, for example, in the annihilation of dark-matter particles. Finding out more about the actual production mechanisms requires detailed studies of the antiproton energy spectrum over a wide energy range, which in turn depend on data with good statistics, as PAMELA now provides.

Analysis of the data from PAMELA shows that the antiproton-to-proton flux ratio rises smoothly to about 10 GeV, before tending to level off. The results match well with theoretical calculations that assume only secondary production of antiprotons by cosmic rays propagating through the galaxy. This places limits on contributions from other, more exotic sources.

ISOLTRAP weighs in with new noble results


Georg Christoph Lichtenberg, the 18th-century philosopher-scientist, said: “To see something new, you must build something new.” This adage certainly applies on the nuclear scale at CERN’s On-Line Isotope Mass Separator, ISOLDE, the pioneering rare-isotope factory. Measurements with the Penning-trap mass spectrometer ISOLTRAP have determined new masses for several isotopes of the noble gases xenon and radon, while discovering a new isotope of radon along the way.

ISOLDE is CERN’s longest-running facility and has always been at the forefront of development. Now the facility is the key player in the European sixth framework design study for EURISOL, a next-generation facility for isotope separation online (ISOL). At ISOLDE the short-lived nuclides are created using 1.4 GeV protons from CERN’s PS Booster. Once produced in the target, these rare species must be ionized efficiently to form secondary beams that can be accelerated and mass-separated for use in experiments. Thus, all ISOLDE targets have a built-in chemically selective ion source.


One of the tasks of EURISOL (in conjunction with the HighInt Marie-Curie Training programme) is the development of an efficient ion source that can accommodate the 50-fold increase in proton beam intensity that will become available at CERN through the upcoming Linac 4 and Superconducting Proton Linac upgrades. This has led to a prototype, the Versatile Arc Discharge Ion Source (VADIS). Its principle rests on the optimization of both the discharge-current densities within the ion-source geometry and the extracted ion-beam intensities (figure 1). The version designed for the selective ionization of noble gases increases ionization efficiency by over an order of magnitude.

VADIS was employed at ISOLDE in 2008 with spectacular results. The experiment in question involved another pioneering facility, ISOLTRAP. ISOLTRAP in effect weighs radioactive nuclides created by ISOLDE using the elegant technique of exciting the cyclotron motion of a single ion in a magnetic field. Knowledge of the mass gives access to the nuclear binding energy, which is not only a rich source of information for nuclear structure, size and shape, but also determines the amount of energy available for radioactive decay and for reactions of major importance for modelling nucleosynthesis, the cooking of elements in stars.
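The mass determination inverts the cyclotron relation ν_c = qB/(2πm). A sketch with an assumed field strength and an illustrative frequency (neither is an actual ISOLTRAP value):

```python
import math

Q_E = 1.602176634e-19     # elementary charge, C
U_KG = 1.66053906660e-27  # atomic mass unit, kg

def mass_u(freq_hz, b_tesla, charge_state=1):
    """Ion mass in atomic mass units, from the measured cyclotron
    frequency: m = q * B / (2 * pi * nu_c)."""
    m_kg = charge_state * Q_E * b_tesla / (2.0 * math.pi * freq_hz)
    return m_kg / U_KG

# Assumed magnetic field and an invented measured frequency for a
# singly charged ion of mass ~133 u (e.g. a caesium reference ion).
b = 5.9            # tesla (assumption)
nu = 681_100.0     # Hz (illustrative)
print(f"m = {mass_u(nu, b):.1f} u")
```

Because frequencies can be measured extraordinarily precisely, this single relation turns a Penning trap into one of the most accurate balances in physics.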

ISOLTRAP first weighed isotopes of xenon ionized by VADIS, determining masses for four more of them. The team then focused its efforts on the neutron-rich isotopes of radon, with impressive results. The experiment determined seven new masses, one for an isotope, 229Rn, that had never previously been observed in the laboratory. As there was no information to confirm this isotope’s identity, the experimenters needed to take particular care to make sure that it was indeed what they thought it to be. As a result, they also determined the half-life of this nuclide (figure 2), marking the first discovery of a nuclide by Penning-trap mass spectrometry (Neidherr et al.). To make things even more interesting, the new radon masses show a unique pattern that provides a link to a special type of nuclear octupole deformation, predicted to occur in this region of the nuclear chart.

Finding the way to polarized antiprotons

CCnew6_03_09

The QCD physics potential of experiments with high-energy polarized antiprotons is enormous, but until now high-luminosity experiments have been impossible. This situation would change dramatically with the production of a stored beam of polarized antiprotons, and the realization of a double-polarized high-luminosity antiproton–proton collider. Recent measurements at the Cooler Synchrotron (COSY) at Jülich have for the first time studied the influence of unpolarized electrons on polarized protons, settling a puzzle over the magnitude of such effects.

The collaboration for Polarized Antiproton Experiments (PAX) has proposed a physics programme that would be possible with a double-polarized proton–antiproton collider at the new Facility for Antiproton and Ion Research (FAIR), which is to be built at GSI in Darmstadt (PAX Collaboration 2006). The original idea was to use polarized electrons to produce a polarized beam of antiprotons (Rathmann et al. 2005). This triggered further theoretical work on the subject and a group from Mainz proposed using co-moving electrons or positrons (e) at slightly different velocities from the orbiting protons or antiprotons (p) as a means to polarize the stored beam (Walcher et al. 2007). When the relative velocities, v, between the e and p are adjusted so that v/c is about 0.002, a numerical calculation by the Mainz group predicts the cross-section for the ep spin-flip to be as large as about 2 × 10¹³ b. Analytical predictions for the same quantity by a group from Novosibirsk, however, yield a range well below one millibarn (Milstein et al. 2008).

CCnew7_03_09

To provide an experimental answer to the puzzle, the collaborations for PAX and for the Apparatus for Studies of Nucleon and Kaon Ejectiles experiment joined forces at COSY, where they mounted an experiment that used the electrons in the electron cooler as a target and measured the effect of the electrons on the polarization of a 49.3 MeV proton beam orbiting in COSY. Instead of studying the build-up of polarization in an unpolarized beam, the teams studied the inverse by observing the depolarization of an initially (vertically) polarized beam; they measured the proton-beam polarization using the analysing power of proton–deuteron elastic scattering on a deuterium cluster jet target (figure 1).

Figure 2 shows the results, with the ratio of the measured beam polarizations, PE and P0 (Oellers et al. 2009). PE represents the measured polarization corresponding to well defined changes of the electron velocity with respect to the protons. This was achieved by detuning the accelerating voltage in the electron cooler by a specific amount compared to the nominal voltage. P0 is the polarization measured when the electron beam was off (i.e. no electron target was present). No depolarization effect on the proton beam could be detected within the statistical precision of the measurement. This translates into an upper limit for the ep transverse and longitudinal spin-flip cross-sections of 1.5 × 10⁷ b at a relative velocity of v/c = 0.002, six orders of magnitude below the numerical predictions. After the completion of the experiment, the Mainz group uncovered a numerical overestimation in their original estimates (Walcher et al. 2009).
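The "six orders of magnitude" can be checked directly from the two numbers quoted above:

```python
# Compare the predicted ep spin-flip cross-section with the measured upper limit.
import math

sigma_predicted = 2e13   # barn, numerical prediction (Walcher et al. 2007)
sigma_limit     = 1.5e7  # barn, experimental upper limit (Oellers et al. 2009)

# How many orders of magnitude below the prediction does the limit lie?
orders = math.log10(sigma_predicted / sigma_limit)
print(round(orders, 1))  # → 6.1, i.e. six orders of magnitude
```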

The result rules out the practical use of polarized leptons to polarize a beam of antiprotons with present-day technologies. This leaves spin-filtering as the only proven method to polarize a stored beam in situ, a technique that exploits the spin-dependence of the strong interaction using a polarized internal target (Rathmann et al. 1993). At present, a complete quantitative understanding of all underlying processes is lacking, so the PAX collaboration aims to use stored protons in COSY for high-precision polarization build-up studies with transverse and longitudinal polarization. Under these circumstances, the build-up process itself can be studied in detail because the spin-dependence of the proton–proton interaction around 50 MeV is completely known. The internal polarized target and the target polarimeter required for these investigations are currently being prepared for installation, together with a large-acceptance detector system for the determination of the beam polarization, in a dedicated low-β section at COSY.

In contrast to the proton–proton system, the experimental basis for predicting the polarization build-up by spin filtering in a stored antiproton beam is practically non-existent. Therefore, it is of high priority to perform a series of dedicated spin-filtering experiments using stored antiprotons. The Antiproton Decelerator at CERN is a unique facility at which stored antiprotons in the appropriate energy range are available with characteristics that meet the requirements for the first-ever antiproton polarization build-up studies.

CDF and D0 report single top quark events

CCnew8_03_09

Almost 14 years to the day after the announcement of the discovery of the top quark in 1995, the CDF and D0 collaborations at Fermilab have announced the observation of top quarks produced singly in proton–antiproton collisions, rather than in top antitop pairs. On 4 March, the two teams submitted their independent results to Physical Review Letters. Unlike pair-production of top quarks, which occurs through the strong interaction, the production of single top quarks occurs through the weak interaction and has important implications for possible new physics beyond the Standard Model.

Only one in every 20,000 million proton–antiproton collisions produces a single top quark, and to make matters worse, the signal of these rare occurrences is easily mimicked by other “background” processes that occur at much higher rates. Both teams have previously published evidence for single top production at Fermilab’s Tevatron, CDF last year and D0 in 2007. These earlier papers reported significance levels of 3.7 σ and 3.6 σ for CDF and D0, respectively. Now both teams report the first observation of the process with a significance of 5.0 σ, based on 3.2 fb⁻¹ of proton–antiproton collision data in CDF and 2.3 fb⁻¹ in D0.
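The quoted rarity translates directly into a production cross-section. A back-of-the-envelope check, assuming a total inelastic proton–antiproton cross-section of roughly 60 mb at Tevatron energies (a round assumed value, not taken from the article):

```python
# Implied single-top cross-section from "one in 20,000 million collisions".
sigma_inelastic_b = 60e-3        # ~60 mb total inelastic cross-section (assumed)
fraction = 1 / 20_000_000_000    # one single-top event per 2e10 collisions

sigma_single_top_b = sigma_inelastic_b * fraction
sigma_single_top_pb = sigma_single_top_b * 1e12  # 1 b = 1e12 pb
print(sigma_single_top_pb)  # a few picobarn
```

The few-picobarn scale that comes out is what makes the signal so hard to separate from backgrounds that are many orders of magnitude larger.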

CCnew9_03_09

Examples of single top quark candidates in D0 (see other image) and CDF. In both events the top quark decays and produces a b quark jet, a muon and a neutrino. In the CDF event (this image), the arrow indicates the direction of the escaping neutrino.
Image credit: D0 and CDF.

The analyses also constrain the magnitude of |Vtb|, an important parameter of the Standard Model’s Cabibbo–Kobayashi–Maskawa (CKM) matrix, which describes how quarks can change from one type to another. If the CKM matrix describes the intermixing of only three generations of quarks – with top and bottom forming the third generation – the value of |Vtb| should be close to one. In the new analysis CDF finds |Vtb| = 0.91 ± 0.11(stat.+syst.) ± 0.07(theor.), while D0 reports |Vtb fL| = 1.07 ± 0.12, where fL is the strength of the left-handed coupling between the W boson and the top and bottom quarks.

Beyond its inherent importance, the discovery of single top quark production presented the collaborations with challenges similar to those of the search for the Higgs boson, in terms of extracting an extremely small signal from a large background. Advanced analysis techniques pioneered for the single top discovery are now in use in both collaborations for the Higgs boson search.

Fermi sees most powerful gamma-ray burst

CCast1_03_09

The Fermi Gamma-ray Space Telescope has observed the evolution of a gamma-ray burst over six orders of magnitude in photon energy. The combination of its brightness and its remote distance makes it by far the most energetic gamma-ray blast ever seen. Furthermore, the observed delay of the highest-energy emission gives a lower limit on the strength of quantum-gravity effects.

Since the launch of the Swift satellite in November 2004, gamma-ray bursts (GRBs) have been detected routinely, at a rate of up to a few per day. The phenomenon now seems commonplace and only the record-breaking bursts attract public attention.

After the “Rosetta stone” GRB 030329 and the “naked-eye” GRB 080319B, here comes the “extreme” GRB 080916C. This giant burst was observed by Fermi, which was launched into space last year. It is one of the rare bursts detected up to giga-electron-volt energies by the Large Area Telescope (LAT), the main instrument aboard Fermi. In five months the LAT has detected only 3 GRBs out of 58 that were in its field of view, according to the positions provided by the secondary instrument, the Gamma-ray Burst Monitor (GBM).

The burst of 16 September 2008, GRB 080916C, was the brightest observed so far and the only one with a distance determined by an observed redshift. The redshift of z = 4.35 ± 0.15, measured by the Gamma-Ray Burst Optical/Near-Infrared Detector (GROND) on the 2.2 m Max Planck Telescope at La Silla, in Chile, locates the collapsing-star event at a distance of 12.2 thousand million light-years. This cosmological distance means that GRB 080916C was intrinsically extremely luminous – at least twice as much as the previous record-holder, GRB 990123, which was observed by the Energetic Gamma-Ray Experiment Telescope aboard the Compton Gamma-Ray Observatory.
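The conversion from redshift to light-travel distance depends on the assumed cosmology. A rough sketch of the standard lookback-time integral, using assumed round parameters (flat ΛCDM with H0 = 70 km/s/Mpc and Ωm = 0.3, not values stated in the article), reproduces the quoted figure of about 12 thousand million years:

```python
# Light-travel (lookback) time to z = 4.35 in a flat LambdaCDM cosmology.
# H0 = 70 km/s/Mpc and Omega_m = 0.3 are assumed round values.
import math

H0 = 70.0 * 1000 / 3.0857e22      # Hubble constant in 1/s (km/s/Mpc -> SI)
OMEGA_M, OMEGA_L = 0.3, 0.7
GYR = 3.156e16                    # seconds per thousand million years

def lookback_gyr(z, steps=100_000):
    """Numerically integrate dt = dz / ((1+z) H(z)) from 0 to z (midpoint rule)."""
    dz = z / steps
    total = 0.0
    for i in range(steps):
        zi = (i + 0.5) * dz
        hz = H0 * math.sqrt(OMEGA_M * (1 + zi) ** 3 + OMEGA_L)
        total += dz / ((1 + zi) * hz)
    return total / GYR

print(lookback_gyr(4.35))  # roughly 12 thousand million years
```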

The Fermi LAT and Fermi GBM collaborations have jointly published a detailed analysis of the emission of this extreme burst. The combined GBM and LAT spectra – covering the range from 8 keV to 300 GeV – are consistent with a very simple spectral shape. Spectra were extracted for five distinct epochs during the evolution of the burst and all have the simple form of a Band function, which smoothly joins low- and high-energy power laws. A simple physical interpretation for such spectra is synchrotron radiation of charged particles in a magnetic field, but this cannot be confirmed, because the synchrotron self-Compton emission expected in this case could not be detected.

The most interesting result is probably the evidence of a consistently increasing delay of higher-energy radiation during the second peak of the GRB emission. This time lag can be intrinsic to the source or induced by quantum-gravity effects along the path from the remote source to the telescope. The delay by about 16 s of the most energetic photon – 13 GeV – with respect to the onset of the burst allows the researchers to derive a lower limit on the quantum-gravity mass scale that is only about one order of magnitude below the Planck mass. The question of whether the observed delay is intrinsic to the source or results from its long journey through the quantum foam of space–time will eventually be solved with the detection of several other bursts with known redshift and measurable time delays.
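In the commonly used linear-dispersion picture, a photon of energy E accumulates a delay Δt = (E / M_QG c²) · ∫₀ᶻ (1+z′) dz′ / H(z′), so the observed delay can be inverted into a lower limit on M_QG. A rough sketch with assumed round cosmological parameters (flat ΛCDM, H0 = 70 km/s/Mpc, Ωm = 0.3; the 13 GeV energy and ~16 s delay come from the burst itself) lands about one order of magnitude below the Planck mass, as quoted above:

```python
# Order-of-magnitude lower limit on the quantum-gravity mass scale from the
# ~16 s delay of the 13 GeV photon in GRB 080916C, assuming linear dispersion:
# Delta_t = (E / M_QG c^2) * integral_0^z (1+z') dz' / H(z').
# The cosmology (H0 = 70 km/s/Mpc, Omega_m = 0.3) is an assumed round choice.
import math

H0 = 70.0 * 1000 / 3.0857e22   # Hubble constant in 1/s
OMEGA_M, OMEGA_L = 0.3, 0.7
M_PLANCK_GEV = 1.22e19         # Planck mass in GeV/c^2

def qg_mass_limit_gev(e_gev, delay_s, z, steps=100_000):
    """Invert the linear-dispersion delay formula for M_QG (in GeV/c^2)."""
    dz = z / steps
    k = 0.0   # integral of (1+z') dz' / H(z'), in seconds
    for i in range(steps):
        zi = (i + 0.5) * dz
        hz = H0 * math.sqrt(OMEGA_M * (1 + zi) ** 3 + OMEGA_L)
        k += (1 + zi) * dz / hz
    return e_gev * k / delay_s

m_qg = qg_mass_limit_gev(13.0, 16.5, 4.35)
print(m_qg / M_PLANCK_GEV)  # roughly 0.1: one order below the Planck mass
```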
