LHCb’s results become more precise

By the time that the first long run of the LHC ended early in 2013, the LHCb experiment had collected data for proton–proton collisions corresponding to an integrated luminosity of 2 fb–1 at 8 TeV, to add to the 1 fb–1 of data collected at 7 TeV in 2011. The first batch of data allowed the LHCb collaboration to announce a variety of results, many of which have now been updated using the larger data sample and/or by including different decay channels. At the 2014 Rencontres de Moriond conference in March, the collaboration presented more precise results from a number of different analyses.

The flavour-changing neutral-current decay B → K*μ+μ− is an important channel in the search for new physics because it is highly suppressed in the Standard Model. While there are relatively large theoretical uncertainties in the predictions, these can be overcome by measuring asymmetries in which the uncertainties cancel. One of these is the isospin asymmetry, based on the differences in the results of measurements of B0 → K*0μ+μ− and B+ → K*+μ+μ−. The Standard Model predicts this isospin asymmetry to be small, which LHCb confirmed in 2011, based on 1 fb–1 of data. On the other hand, a similar analysis for decays in which the excited K* is replaced by its ground state K showed evidence for a possible isospin asymmetry.

Now, the analysis of the full 3 fb–1 of data, which was presented at the Moriond conference, gives results that are consistent with the small asymmetry predicted by the Standard Model in both the K* and K cases. However, even though this confirms that the difference between B0 and B+ decays is small for this channel, the differential branching fractions tend to have lower values than the theoretical predictions, as the figures show.

Another interesting result that LHCb has now refined concerns the exotic state X(3872), which was discovered by the Belle experiment at KEK in 2003. The nature of the X(3872) is puzzling because, although it appears charmonium-like, it does not fit into the expected charmonium spectrum. Exotic interpretations include the possibility that it could be a DD̄* molecule or a tetraquark state.

With the data from 2011, LHCb unambiguously determined its quantum numbers JPC to be 1++. At Moriond the collaboration went further by presenting a measurement of the ratio of the branching fractions for the decay of the X(3872) into ψ(2S)γ and J/ψγ. This ratio, Rψγ, is predicted to differ depending on the nature of the X(3872). LHCb finds Rψγ = 2.46±0.64±0.29, which is compatible with other experiments but more precise. This value does not support the interpretation of the X(3872) as a pure DD̄* molecule.

ATLAS uses t → qH decays to pin down the Higgs

Since the observation of a Higgs boson at a mass around 125.5 GeV by ATLAS and CMS in July 2012, both collaborations have been making every effort to pin it down and decide whether it is indeed the Higgs boson of the Standard Model, or the first member of a somewhat larger family, as predicted by several models that go beyond the Standard Model. Working in this direction, ATLAS used the six million tt̄ pairs produced in Run I of the LHC to look for the possible decay of a top quark or antiquark into a light quark (up or charm) and a Higgs boson, t → qH.

In the Standard Model such decays, which proceed via flavour-changing neutral currents, are highly suppressed, but in more complex models they might be present, albeit with a small branching ratio compared with the dominant t → bW decay. Searching in the dominant decay mode of the Higgs boson (H → bb̄) would lead to final states that are very hard to distinguish from the majority of tt̄ decays. ATLAS therefore chose to use the H → γγ decay mode – which has a clean signature of two photons with high transverse momentum (pT) clustering as a narrow peak in invariant mass around 125.5 GeV – a decay mode whose power was demonstrated by the Higgs-boson discovery. Unfortunately, its use is hampered by a small branching fraction of only 0.23%. Putting the numbers together, and taking into account the acceptance of the detector and of the selection, a branching ratio B of 1% for t → qH would lead to about 11 observed events in a topology with two high-pT photons and four jets, of which one would be identified as a b-jet. In addition, about three events with two high-pT photons, two jets, a lepton and missing transverse momentum (from the leptonic decay of the W) would also be expected.
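
As a rough cross-check of these numbers (an illustrative estimate, not the collaboration's own calculation), the expected yield follows from the quoted inputs; the combined acceptance and selection efficiency ε ≈ 5% used below is the value implied by the quoted yields rather than a figure given in the text:

\[ N \;\approx\; 2\,N_{t\bar t}\,\mathcal{B}(t\to qH)\,\mathcal{B}(H\to\gamma\gamma)\,\varepsilon \;\approx\; 2\times(6\times10^{6})\times 0.01\times 0.0023\times 0.05 \;\approx\; 14, \]

where the factor of two counts the possibility of either the top quark or the top antiquark in each pair undergoing the rare decay, consistent with the 11 hadronic plus roughly three leptonic events quoted above.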

After making kinematical cuts to ensure the compatibility of the selected events with the tt̄ final state, ATLAS obtained the diphoton mass spectrum shown in the figure. This rules out B = 1% immediately, because it is clear that there is no 11-event signal at 125.5 GeV. A detailed statistical analysis gives an expected limit on B of 0.53%. The small, non-significant excess in the 124–128 GeV bin worsens the observed limit to 0.79%, at the 95% confidence level.

This is the first experimental result on this channel and its precision is limited mainly by the available statistics. When data become available at 13/14 TeV – leading to an increase of the tt̄ production cross-section by almost a factor of four – and with a larger integrated luminosity, either a much tighter limit will be obtained or, perhaps, a significant signal will show up, giving evidence for physics beyond the Standard Model in the Higgs sector.

LHCf investigates proton–lead collisions

The final run of the LHC in January 2013 prior to the start of the current long shutdown provided collisions between a beam of protons and a beam of lead ions, allowing the LHCf experiment to make further studies related to the interactions of cosmic rays in the Earth’s atmosphere. In particular, the collaboration was able to measure the distribution in transverse momentum (pT) for the inclusive production of neutral pions in the very forward region.

Despite several experimental indications from the HERA electron–proton collider at DESY, it is still not well understood how the density of partons (quarks and gluons) in a proton target increases, or even saturates, when Bjorken-x in the target – essentially the fraction of the proton’s momentum carried by the struck parton – is extremely small. Such phenomena are known to be visible in events at large rapidities – that is, close to the beam direction. Furthermore, in the case of nuclear targets, the parton density in the target is expected to be larger by a factor of about A1/3, where A is the nuclear mass number. In hadronic interactions, partons in the projectile hadron would lose energy while travelling through the dense QCD-governed matter of the nuclear target, and particle-production mechanisms would change accordingly when compared with those in nucleon–nucleon interactions.
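
In the language of parton saturation, this enhancement is often expressed through the saturation scale; as a rough and commonly quoted scaling (not a result of the LHCf analysis itself),

\[ Q^2_{s,A}(x) \;\sim\; A^{1/3}\,Q^2_{s,p}(x), \]

so for a lead target (A = 208) the expected enhancement over a proton is a factor of about 208^{1/3} ≈ 6.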

The LHCf detector is designed to measure the hadronic production cross-sections of neutral particles emitted at angles close to the beam direction – the “very forward” region – in proton–proton (pp) and proton–lead (pPb) collisions at the LHC. The detector covers pseudorapidities above 8.4 and is capable of precise measurements of the inclusive production cross-sections of high-energy neutral particles. Now, the collaboration has analysed the data taken in January 2013 on pPb collisions at a nucleon–nucleon centre-of-mass energy of √sNN = 5.02 TeV and a beam-crossing angle of 145 μrad, for an integrated luminosity of 0.63 nb−1.

To obtain the soft-QCD component of forward pion production, which is sensitive to the parton density in the target, the unavoidable contamination from ultra-peripheral collisions was first calculated using Monte Carlo simulations and then subtracted from the measured pT spectra. Once the ultra-peripheral collisions have been taken into account, the pT spectrum measured by LHCf in the rapidity range −11.0 < ylab < −8.9 and 0 < pT < 0.6 GeV (in the detector reference frame) indicates a strong suppression of the production of neutral pions. This leads to a nuclear modification factor, RpPb, relative to the interpolated pT spectra in pp collisions at √s = 5.02 TeV, of about 0.1–0.4 – a value that is in overall agreement with the predictions of several Monte Carlo simulations of hadronic interactions.
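
For orientation, the nuclear modification factor compares the yield per nucleon–nucleon collision in pPb with that in pp; written in its generic form (the LHCf analysis uses interpolated pp spectra at the same collision energy as the reference),

\[ R_{p\mathrm{Pb}}(p_\mathrm{T}) \;=\; \frac{1}{\langle N_\mathrm{coll}\rangle}\,\frac{\mathrm{d}N_{p\mathrm{Pb}}/\mathrm{d}p_\mathrm{T}}{\mathrm{d}N_{pp}/\mathrm{d}p_\mathrm{T}}, \]

so RpPb = 1 would correspond to pPb behaving like an incoherent superposition of nucleon–nucleon collisions, whereas the measured 0.1–0.4 signals a strong suppression of forward neutral-pion production.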

OPERA sees a fourth τ neutrino

The OPERA experiment at the INFN Gran Sasso Laboratory has detected a fourth example of neutrino oscillation, with a muon neutrino (νμ) produced at CERN detected as a τ neutrino (ντ) after travelling a distance of 730 km.

The international OPERA experiment, which involves 140 physicists from 28 research institutes in 11 countries, was designed to observe this exceptionally rare phenomenon, gathering data in the neutrino beam produced by the CERN Neutrinos to Gran Sasso (CNGS) project. Generated by decays of pions and kaons made in the interactions of a proton beam from the Super Proton Synchrotron with a graphite target, the beam consisted mainly of νμ that would pass unhindered through the Earth’s crust towards Gran Sasso. The appearance and subsequent decay of a τ lepton in the OPERA experiment provides the telltale sign of νμ to ντ oscillation through a charged-current interaction.

After the first neutrinos arrived at the Gran Sasso Laboratory in 2006, the experiment gathered data for five consecutive years, from 2008 to 2012, during which the CNGS beam delivered a total of 17.97 × 10¹⁹ protons on target, yielding 19,500 neutrino events in the detector. The first ντ was observed in 2010, the second and third ones in 2012 and 2013, respectively.

The detection of the fourth ντ is important confirmation of the events seen previously. It means that the νμ to ντ transition has been seen for the first time with a statistical significance exceeding the 4σ level, so that OPERA can now claim the observation of this extremely rare phenomenon. The collaboration will continue to search for ντ in the data that remain to be analysed.

BICEP2 finds evidence of cosmic inflation

The news came as a surprise on 17 March, making a “big bang” in the physics community. Within hours of the announcement, physicists around the world had become aware of the existence of the Background Imaging of Cosmic Extragalactic Polarization (BICEP2) telescope at the South Pole, and were hypnotized by the figure showing the swirling B-mode polarization of the cosmic microwave background (CMB), and by the profound implications of the discovery. The observations not only provide the first direct evidence for inflation, but also determine its energy scale and bear witness to a quantum-gravitational process.

The idea of cosmic inflation was originally proposed in 1980 by Alan Guth, then at Cornell University, to solve several cosmological problems identified in the 1970s. In this scenario, the inflationary epoch is an extremely brief period just after the Big Bang, lasting a mere 10⁻³² s. During this minuscule fraction of a second, the universe would have expanded at superluminal speed by a factor of at least 10²⁵. Inflation would result from a hypothetical inflaton field acting as a cosmological constant to produce an accelerated expansion of the universe, and would end with the decay of the inflatons into Standard Model particles.
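
For reference, an expansion factor of 10²⁵ corresponds to

\[ N_e = \ln(10^{25}) = 25\ln 10 \approx 58 \]

e-folds of expansion, the unit in which inflationary growth is usually quoted.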

During inflation, quantum fluctuations of the inflaton field would be stretched and amplified to produce the density fluctuations of the CMB observed by the Wilkinson Microwave Anisotropy Probe and Planck satellites (CERN Courier May 2006 p12, May 2008 p8, May 2013 p12). Theorists have speculated further that quantum fluctuations of the space–time metric would also be “frozen in” by inflation, producing characteristic gravitational waves. From inflation to the recombination epoch – when electrons combine with protons to form hydrogen atoms – there would have been 380,000 years during which photons would scatter off electrons and become polarized. The net polarization of the CMB therefore reflects inhomogeneities in the hot plasma of the early universe.

Whereas both density and metric fluctuations can produce a gradient field in the sky – the so-called E-mode polarization – only metric fluctuations can produce the curl component of the polarization, the so-called B mode. Although there are foreground contaminations that can produce B modes at lower angular scales, finding B-mode polarization on the scale of a few degrees implies the presence of primordial gravitational waves.

It is this type of polarization that BICEP2 has discovered in the CMB. The signal is more than 100 times weaker than the intensity fluctuations of the CMB, which explains why these tiny variations, at the 0.1 μK level, had not been detected earlier. To achieve this precision, the BICEP2 experiment is equipped with 512 detectors cooled to 0.27 K and installed at the South Pole. At an altitude of more than 3000 m, the site provides the closest conditions to space, with cold, dry, stable air.

The strong B-mode signal found by the BICEP2 experiment corresponds to a tensor-to-scalar ratio r = 0.20 (+0.07/−0.05), with r = 0 disfavoured at the 6σ level. This is very good news for theorists, who feared a much lower value of r, which would have prevented the detection of B modes. The high value of r implies an energy scale for inflation of around 2 × 10¹⁶ GeV. The BICEP2 measurements therefore offer a glimpse of physics at an energy approaching the Planck scale, where all of the fundamental forces are thought to be unified. This explains the burst of nearly 100 new publications citing the BICEP2 paper that had appeared by the end of March. The Planck satellite and other facilities now face the challenge of confirming these exciting results.
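
The quoted energy scale follows from the standard slow-roll relation between the tensor-to-scalar ratio and the inflationary potential V (a commonly used conversion, not specific to the BICEP2 analysis):

\[ V^{1/4} \;\approx\; 1.06\times10^{16}\ \mathrm{GeV}\,\left(\frac{r}{0.01}\right)^{1/4}, \]

which for r ≈ 0.2 gives V^{1/4} ≈ 2 × 10¹⁶ GeV.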

BICEP2 is the second stage of a co-ordinated programme with the BICEP and Keck Array experiments. The four principal investigators are John Kovac (Harvard/CfA), Clem Pryke (University of Minnesota), Jamie Bock (Caltech/JPL), and Chao-Lin Kuo (Stanford/SLAC).

BESIII and the XYZ mystery

BESIII

BESIII is the latest incarnation of an experimental programme that began in 1989 when the Beijing Electron–Positron Collider (BEPC) and the Beijing Spectrometer (BES) detector started operation at the Institute of High Energy Physics (IHEP). The focus is on the physics of charm and the τ lepton, which are accessible at the centre-of-mass energies of BEPC. The BES programme is the only one in the world to focus entirely on this area of particle physics through the collection of record numbers of J/ψ, ψ´, D and τ particles. During the past two decades, thanks to the luminosity available first at BEPC and then at BEPCII, the BES collaboration has made many important, high-precision measurements. More recently, this has led to investigations of new particles – the XYZ particles – that appear not to fit in with the standard picture of charmonium states.

One of the first major contributions of the BES programme came in 1992, when the collaboration made a much more precise measurement of the mass of the τ lepton and cleared up a big disagreement between the particle’s mass, its lifetime and its branching ratio to electrons – quantities that are related by the Standard Model. Then from 1993 to 1997, BEPC and BES were upgraded. BES became BESII and received a new main drift chamber (MDC) and time-of-flight (TOF) system. The collaboration soon embarked on a scan of the ratio of hadron to muon-pair production, which measured the hadronic cross-section at 93 energy points in the range 2–5 GeV and improved the precision in this region from 15–20% to less than 6%.
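
The quantity scanned is the textbook ratio

\[ R(s) \;=\; \frac{\sigma(e^+e^-\to\mathrm{hadrons})}{\sigma(e^+e^-\to\mu^+\mu^-)}, \]

measured as a function of the centre-of-mass energy √s; the definition is included here for clarity.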

These cross-section results, together with many different measurements from Fermilab, CERN’s Large Electron–Positron collider and the LHC, are used in stringent tests of the Standard Model. The cross-section measurements are required to determine the value of the fine structure constant, αQED – which is not constant – at the mass of the Z boson, αQED(MZ). The new cross-section measurements shifted the value of αQED(MZ) and also moved the mass of the Higgs boson predicted by the Standard Model to be more in line with the measured lower limits on the mass at that time. BES and BESII also produced many other results on J/ψ and ψ´ hadronic decays, ψ´ transitions, and D and Ds decays.
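
Schematically, the hadronic contribution to the running of the electromagnetic coupling is obtained from the measured R(s) through a dispersion integral of the standard form

\[ \Delta\alpha_\mathrm{had}(M_Z^2) \;=\; -\frac{\alpha M_Z^2}{3\pi}\,\mathrm{Re}\int_{s_\mathrm{thr}}^{\infty}\mathrm{d}s\;\frac{R(s)}{s\,(s-M_Z^2-i\epsilon)}, \]

where s_thr is the hadronic threshold, so improving R(s) in the 2–5 GeV region feeds directly into the precision of αQED(MZ) and of the electroweak fits that depend on it.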

The upgrade of BEPC to BEPCII began in 2004 and finished in 2008. The facility became a two-ring collider with 93 beam bunches in each ring, superconducting micro-β focusing quadrupole magnets, superconducting RF, and a design luminosity of 1 × 10³³ cm⁻² s⁻¹. At the same time, a brand new detector – BESIII – was constructed with a small-celled, helium-based MDC, a new TOF system, a CsI(Tl) electromagnetic calorimeter, a resistive-plate-chamber muon identifier and a 1 T superconducting solenoidal magnet.

In the first year of operation, 2009, BESIII accumulated 106 million ψ´ events and 226 million J/ψ events. With the ψ´ data, BESIII was able to observe clearly the process ψ´ → π0hc followed by hc → γηc and measure for the first time the individual branching ratios, which allowed comparison with theoretical predictions.

Later, BESIII measured the mass and width of the ηc, taking into consideration for the first time interference between the resonance and the non-resonant background. Previously, the CLEO collaboration had pointed out that the masses and widths of the ηc came out differently when measured in ψ´ radiative decay and when measured in proton–antiproton or two-photon production. Including the interference effect produced results that were consistent with the latter, and the most precise measurements to date. Moreover, BESIII was able to observe for the first time the M1 transition ψ´ → γηc(2S) and to measure the mass and width of the ηc(2S) and the branching fraction for this process. With the J/ψ data, BESIII confirmed the X(1835) seen by BESII and observed two new resonances, the X(2120) and the X(2370), in the process J/ψ → γπ+π−η´.

In the following years, BESIII accumulated another 1000 million J/ψ events, 400 million ψ´ events, and approximately 3 fb–1 of data at the ψ(3770) resonance. The ψ(3770) decays more than 90% of the time to quantum-correlated DD̄ pairs, which allow the measurement of absolute branching ratios, as well as of DD̄ mixing. The collaboration recently made the most precise determination of the branching ratio of D+ → μ+ν, which allows determination of the pseudo-scalar decay constant, fD+, using the world-average value of the Cabibbo–Kobayashi–Maskawa matrix element |Vcd|, or determination of |Vcd| using the lattice-QCD value of fD+. The energy region of the τ and charm is extremely rich in the variety of physics topics available, and BESIII is accumulating world-class data sets to study them.
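
The connection between the measured branching fraction and fD+ or |Vcd| comes from the usual tree-level expression for the leptonic decay width:

\[ \Gamma(D^+\to\mu^+\nu_\mu) \;=\; \frac{G_F^2}{8\pi}\,f_{D^+}^2\,m_\mu^2\,m_{D^+}\left(1-\frac{m_\mu^2}{m_{D^+}^2}\right)^{\!2}|V_{cd}|^2, \]

so that taking |Vcd| from the world average yields fD+, while taking fD+ from lattice QCD yields |Vcd|.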

XYZ physics

The X(3872) was discovered in the decay of B mesons at KEK by the Belle experiment in 2003. It was the first member of a family of exotic particles whose masses do not agree with those predicted for charmonium states in this mass region and which decay in a peculiar way. Rather than decaying as expected into a pair of particles with open charm, such as a D meson and its antiparticle D̄, they decay into π+π−J/ψ. In 2005, the BaBar experiment at SLAC discovered the Y(4260) in initial-state radiation (ISR) production, where much of the electron or positron energy is radiated away, leaving the remaining energy at 4260 MeV. Like the X(3872), the Y(4260) has a mass that does not agree with those expected for charmonium, and it also decays to π+π−J/ψ.

The X(3872) and Y(4260) are members of the XYZ family of particles, which now contains numerous members, although many of them are not yet confirmed. The discovery of these particles, which do not fit into the standard picture, has sparked a great deal of theoretical interest and many papers.

In December 2012, BESIII jumped into the world of XYZ physics by beginning to take data at 4.26 GeV – the energy of the Y(4260). Running at this energy has the advantage that Y(4260) events might be produced directly rather than indirectly by B decay or ISR production, both of which have a much smaller cross-section.

Analysing the accumulated sample after one month of data taking, the collaboration found 1477 e+e− → π+π−J/ψ, J/ψ → l+l− events – where l is an electron or a muon – and obtained a cross-section consistent with Y(4260) production (Ablikim et al. 2013a). The π±J/ψ mass distribution, shown in figure 1, revealed an unexpected structure that was named the Zc(3900). The mass and width of the Zc(3900) are 3899.0±3.6±4.9 MeV/c2 and 46±10±20 MeV, respectively. The decay contains both charmonium – the J/ψ – and a charged pion, suggesting that the Zc(3900) contains four quarks. The discovery was quickly confirmed by the Belle collaboration and by an analysis of CLEO data. Other charged charmonium-like particles had been found earlier by Belle but never confirmed, so this is the first confirmed Z state.

Data taking continued through to June 2013 at 13 energies between 3.9 and 4.4 GeV, bringing the total luminosity to approximately 2.5 fb–1, and the analysis of four other processes has now been completed. The first is e+e− → π+π−hc, where hc → γηc and the ηc decays to 16 exclusive hadronic states (Ablikim et al. 2013b). This is similar to the previous analysis, with the J/ψ replaced by the hc – another charmonium particle. Here again the π±hc mass distribution reveals a narrow structure, named the Zc(4020), as shown in figure 2. The mass and width of the Zc(4020) are 4022.9±0.8±2.7 MeV/c2 and 7.9±2.7±2.6 MeV, respectively. No significant Zc(3900) signal is seen in this process.

The second process analysed is e+e− → π±(D*D̄*), where a partial reconstruction technique is used that requires the identification of the π±, a charged D from the decay of a charged D*, and one π0 from either the D* or the D̄* decay (Ablikim et al. 2014a). The analysis is based on 827 pb–1 of data at 4.26 GeV. When the mass recoiling from the π± is plotted, an enhancement is seen, as shown in figure 3, so the process is interpreted as e+e− → π±Zc(4025), Zc(4025) → (D*D̄*), where the mass and width of the Zc(4025) are 4026.3±2.6±3.7 MeV/c2 and 24.8±5.6±7.7 MeV, respectively.

The third process is e+e− → π±(DD̄*), where again a partial reconstruction technique is used, requiring that the π± and a D be identified (Ablikim et al. 2014b). The analysis is based on 525 pb–1 of data at 4.26 GeV. When the mass of the (DD̄*) system is plotted, an enhancement is seen, as shown in figure 4, so the process is interpreted as e+e− → π±Zc(3885), Zc(3885) → (DD̄*), where the mass and width of the Zc(3885) are 3883.9±1.5±4.2 MeV/c2 and 24.8±3.3±11.0 MeV, respectively. The data prefer that the Zc(3885) has spin-parity JP = 1+.

Some of the Zc states described above might be the same state. Interference has been neglected in the fitting of the peaks, and it could shift the masses and widths obtained. However, there are probably at least two separate Zc states.

So far the X(3872) has been seen only in B decays and hadron collisions, but its quantum numbers are such that it could also be produced in radiative decays of the Y(4260). Figure 5 shows the π+π−J/ψ mass distribution for e+e− → γπ+π−J/ψ events from the combined data at 4.009, 4.229, 4.26 and 4.36 GeV (Ablikim et al. 2014c). The clear peak has a mass of 3872.1±0.8±0.3 MeV, to be compared with the mass m(X(3872)) = 3871.68±0.17 MeV listed in the Particle Data Group tables. Although the events could be produced directly, it is highly plausible that the X(3872) comes from radiative decay of the Y(4260).

There are many possible theoretical explanations for the XYZ particles, including the Y(4260) and the recently discovered Zc structures observed by BESIII. They include four-quark models with molecular states comprising charm and anti-charm particles, tetraquark states, and hadro-charmonium, as well as hybrid states (charmonium states with an extra gluon) and a model of initial single-pion emission. More experimental results are necessary to check the predictions of the various models and to decide which ones, if any, describe the physics correctly.

BESIII entered the era of XYZ physics by acquiring about 2.5 fb–1 of data at around 4.26 and 4.36 GeV. Currently, more data are being acquired and many other analyses of the data collected so far are in progress. Future results will help decide among the various models, or rule them all out.

A network for life

The Particle Training Network for European Radiotherapy (PARTNER) was established in 2008 to train young biologists, engineers, radio-oncologists and physicists in the various aspects of hadron therapy. This deceptively simple statement hides a vision that was truly innovative when the project started: to offer a multidisciplinary education in this cutting-edge discipline to train a future generation of experts who would be aware of the different scientific and technological challenges and move the field forward. PARTNER went on to provide research and training opportunities for 29 young scientists from a variety of backgrounds and countries, between 2008 and 2012. The publication of selected papers from PARTNER in the Journal of Radiation Research offers the opportunity to assess the research outcomes of the project.

As a Marie Curie Initial Training Network (ITN) within the European Union’s 7th Framework Programme (FP7), PARTNER was naturally focused on education, with a training programme encompassing science, technology and transferable skills (CERN Courier March 2010 p27). At the same time, the young scientists became engaged in research on a variety of topics from radiobiology to motion monitoring techniques, dosimetry, accelerators, computing and software tools. All of the research projects shared a focus on the impacts of clinical application, and many brought significant advances to the field.

Ingenious technologies

A key technology area is the development of affordable hadron-therapy installations. The next generation of accelerators should be smaller and less expensive. At the same time, they should allow fast, active energy modulation and have a high repetition rate, so that moving organs can be treated appropriately in reasonable time. PARTNER contributed to the design for the CArbon BOoster for Therapy in Oncology (CABOTO) – a compact, efficient high-frequency linac to accelerate C6+ ions and H2+ molecules from 150 to 410 MeV/u in about 24 m.
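
A back-of-the-envelope figure (not a design number quoted by the CABOTO study) illustrates what this specification implies for a fully stripped carbon ion:

\[ \Delta E \approx (410-150)\ \mathrm{MeV/u}\times 12 \approx 3.1\ \mathrm{GeV}, \qquad V_\mathrm{eff} = \frac{\Delta E}{6e} \approx 520\ \mathrm{MV}, \qquad \langle G\rangle \approx \frac{520\ \mathrm{MV}}{24\ \mathrm{m}} \approx 22\ \mathrm{MV/m}, \]

an average accelerating gradient of the order of 20 MV/m, which is what motivates the choice of high-frequency linac structures. The same effective voltage also suits H2+, which has the same charge-to-mass ratio as C6+.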

Gantries – the magnetic structures that bring particle beams onto the patient at the desired angle – are a major issue in the construction of carbon-ion facilities. The only existing carbon-ion gantry is installed at the Heidelberg Ion-Beam Therapy Center (HIT). It is a fixed, isocentric gantry 6.5 m tall and 25 m long, with a total weight of 600 tonnes. A design study supported by PARTNER and the FP7 project ULICE (CERN Courier December 2011 p37) proposed an innovative solution based on both the gantry and the treatment room being mobile. The isocentric gantry consists of a 90° bending dipole that rotates around the axis of the beam entrance, while the treatment room can move ±90°, thanks to an arrangement that keeps the floor of the room horizontal – like the cabin in a panoramic wheel (see figure). This design reduces the weight and dimensions of the gantry greatly and hence the overall cost.

Clever solutions are also needed to ensure the correct positioning of the patient for treatment. This is particularly important in the case of tumours that change position as organs move – for example, when the patient breathes. A standard technique to reposition the patient accurately at each treatment session involves the implantation of radiographically visible fiducial markers. These markers must not introduce imaging artefacts or perturb the dose delivery process. In particle therapy, however, the interaction of the therapeutic beam with the markers can have a significant impact on the treatment. In this context, PARTNER conducted a study at the treatment set-up at HIT to compare a range of commercially available markers of different materials, shapes and sizes. Some of the markers offered promising results and will soon be used in clinical routine, but the study highlighted that markers should be chosen carefully, taking into account both the tumour localization and the irradiation strategy.

The combination of image guidance with a mask-immobilization system was also investigated at HIT on patients with head-and-neck, brain and skull-base tumours. The study demonstrated that, for the same immobilization device, different imaging verification protocols translate into important differences in accuracy.

At the National Centre for Oncological Treatment (CNAO) in Pavia, PARTNER researchers carried out a comparative analysis of in-room imaging versus an optical tracking system (OTS) for patient positioning. The results showed that while the OTS cannot replace the in-room imaging devices fully, the preliminary OTS correction can greatly support the refinement of the patient set-up based on images, and provide a secondary, independent verification system for patient positioning.

State-of-the-art techniques are also needed for treatment planning – the tool that allows medical physicists to translate the dose prescribed by the oncologists into the set-up parameters for the beam. A PARTNER research project developed a novel Monte Carlo treatment-planning tool for hadron therapy, suitable for treatments delivered with the pencil-beam scanning technique. The tool allows the set-up of single and multiple fields to be optimized for realistic conditions for patient treatment, and also allows dosimetric quality assurance to be performed. Another study led to an accurate parameterization of the lateral dose spread for scanned proton and carbon-ion beams, which is currently in clinical use at HIT and CNAO.

Set-up errors and organ motion can influence the dose distribution during a treatment session. To deal with these potential variations, additional margins are applied to the tumour target, forming the so-called planning target volume (PTV). This procedure ensures that the tumour is irradiated entirely, but inevitably increases the dose delivered to the surrounding healthy tissues. PARTNER researchers studied the generation of a patient-specific PTV from multiple images and were able to achieve satisfactory control of possible target variations, with no significant increase in the dose delivered to organs at risk.

The great attraction of hadron therapy is the possibility of a precisely tailored dose distribution, which allows tumour cells to be hit while sparing the healthy tissues. Sophisticated measurements are needed to verify the actual dose delivered in a specific beam set-up, and air-filled ionization chambers are extensively used in this context. The conversion of data from the ionization chambers into standard dosimetric quantities employs a quality factor that accounts for the specificity of the beam. The ratio of water-to-air stopping power is one of the main components of this quality factor and – in the case of carbon-ion beams – its biggest source of uncertainty. PARTNER researchers developed a fast computational method to determine this stopping-power ratio, with results that were in good agreement with full Monte Carlo calculations.
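
The role of the stopping-power ratio can be seen in the Bragg–Gray relation used to convert the chamber reading into dose to water, written here schematically (the full dosimetry protocols include additional beam-quality and perturbation factors):

\[ D_\mathrm{w} \;=\; \frac{Q}{m_\mathrm{air}}\;\frac{W_\mathrm{air}}{e}\;s_\mathrm{w,air}\;p, \]

where Q is the collected charge, m_air the mass of air in the cavity, W_air/e the mean energy required to produce an ion pair per unit charge, s_w,air the water-to-air stopping-power ratio and p a product of perturbation corrections. Any uncertainty in s_w,air therefore propagates directly into the verified dose for carbon-ion beams.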

Faster calculation methods are essential to re-compute the treatment plan quickly when needed, but they should not reduce the accuracy of the treatment planning. The PARTNER studies also demonstrated that a chamber-specific correction could be implemented in the treatment planning, bringing a small improvement to the overall accuracy of the verification of the plan.

Combining treatment modalities has become a standard approach in oncology, and it is important to understand how hadron therapy can fit into these combined treatment schemes. Within the PARTNER framework, three emerging treatment modalities were compared: volumetric-modulated arc therapy (VMAT), intensity-modulated proton beam therapy (IMPT) and intensity-modulated carbon-ion beam therapy (IMIT). Their combinations were also evaluated. The results clearly showed a better dose distribution in the case of combined treatments, but their actual clinical benefit remains to be demonstrated.

Biological factors

In the biological field, studies were performed to understand better the impact of hypoxia – oxygen deprivation – on cell survival, for various types of radiation therapy. Hypoxia is well known as one of the major reasons for the resistance of tumour cells to radiation. It also enhances the risk of metastatic formations. Understanding radioresistance is a key factor for more effective cancer therapy that will minimize local recurrences. Different levels of oxygen deprivation were studied, from intermediate hypoxia to total oxygen deprivation or anoxia. Cells irradiated under chronic anoxia turned out to be more sensitive to radiation than those under acute anoxia. Measurements also suggested that ions heavier than carbon could bring additional advantages in therapeutic irradiation, in particular for radioresistant hypoxic tumour regions.

The initial clinical experience at the CNAO facility provided the opportunity to study toxicity and quality of life for patients under the protocols approved by the Italian Health Ministry, namely for chordoma and chondrosarcoma. The preliminary results showed that all patients completed their treatment with no major toxicities and without interruptions, and that proton therapy did not affect their quality of life adversely. The assessment of quality of life in patients with these tumours is so far unique, as no other study of this kind has been published.

Side effects such as toxicity are an integral part of the information that determines the appropriate choice of treatment. Realistic, long-term data on such effects are difficult to obtain, mainly because of the limited duration of medical studies, so decision-making processes in medicine rely increasingly on modelling and simulation techniques. One of the PARTNER research projects focused on the implementation of a general Markov model for the analysis of side effects in radiotherapy, and developed a specific language to encode the medical understanding of a disease in computable definitions. The proposed method has the potential to automate the generation of Markov models from existing data and to be applicable to many similar decision problems.
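
As a purely illustrative sketch of the idea – the states, transition probabilities and code below are invented for illustration and are not taken from the PARTNER work – a discrete-time Markov model of radiotherapy side effects can be simulated in a few lines of Python:

    # Minimal sketch of a discrete-time Markov model for radiotherapy side effects.
    # States and transition probabilities are invented for illustration only.
    import random

    STATES = ["no toxicity", "mild toxicity", "severe toxicity"]
    # TRANSITIONS[i][j]: probability of moving from state i to state j per follow-up cycle
    TRANSITIONS = [
        [0.90, 0.08, 0.02],  # from "no toxicity"
        [0.20, 0.70, 0.10],  # from "mild toxicity"
        [0.00, 0.10, 0.90],  # from "severe toxicity"
    ]

    def simulate(cycles, start=0):
        """Return the sequence of states visited over a number of follow-up cycles."""
        state, history = start, [STATES[start]]
        for _ in range(cycles):
            state = random.choices(range(len(STATES)), weights=TRANSITIONS[state])[0]
            history.append(STATES[state])
        return history

    if __name__ == "__main__":
        print(simulate(cycles=10))

In a real application the transition probabilities would be estimated from clinical follow-up data, and quantities such as the expected time spent in each toxicity grade would be computed from the model rather than from a single simulated history.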

Making optimal use of the available resources is a major challenge for the hadron-therapy community, with secure data sharing at the heart of the problem. The Hadron therapy Information Sharing Prototype (HISP) was developed within PARTNER to provide a gateway to patient information that is distributed in many hospital databases, and to support patient follow-up in multicentre clinical studies. HISP demonstrates a range of different and important features, and uses open-source software components that are important for the platform’s sustainable extension and potential for adoption.

The PARTNER network made important contributions to key research areas connected to hadron therapy, geared towards the optimization of this option for cancer treatment. A unique multidisciplinary training portfolio allowed more than 90% of the PARTNER scientists to find positions soon after the end of the project, thanks also to the expertise acquired at the most advanced European hadron-therapy centres and to the networking opportunities provided by the ITN. The medical doctors from India and Singapore went back to their countries and hospitals, while most of the other researchers are now working in hadron-therapy facilities in Europe, the US and Japan. The specific goal of training experts for upcoming and operational facilities was therefore successfully met, and the researchers ensure that the network lives on, wherever they are in the world.

• The PARTNER project was funded by the European Commission within the FP7 People (Marie Curie) Programme, under Grant Agreement No 215840.

François de Rose: strategist and visionary

François de Rose (left) and John Adams

Visionaries have the freedom of mind to shape the future when other people’s horizons are obstructed by the present. François de Rose was a visionary. In the aftermath of the Second World War, when Europe was in ruins and everything had to be rebuilt, the diplomat understood the importance of reviving fundamental research and, above all, of co-operation on a continental scale as the driving force of this ambition. In a Europe that was just starting to get back on its feet, it would be no mean feat. Nonetheless, François, alongside the prominent physicists of the time, put his energy into making this vision a reality. They lobbied governments for the creation of a centre that would work towards this goal and, having won their support, saw CERN established in 1954 – an achievement of which François was extremely proud. “The result is even better than its founders hoped for,” he was often heard saying. His pride was even greater knowing that a visionary’s ideas, however strongly he or she believes in them, often take years to become reality and are sometimes never realized at all.

A strategist and a visionary to the end, François de Rose passed away on 23 March 2014 in Paris at the age of 103, having recently published his memoirs, Un diplomate dans le siècle. With his passing CERN has lost the last of its founding fathers, a loyal supporter and a dear friend.

Born in 1910 in Carcassonne in the south of France, he lost his right eye in a childhood accident, which prevented him from following the family tradition of a military career. His father, Charles de Tricornot de Rose, had been the founding father of combat aviation in France, the holder of the first military aviation licence, and had died in action in 1916.

After obtaining his baccalauréat, François embarked on a career as a diplomat and joined the French Embassy in London in 1937. He enjoyed recounting the splendid receptions that he attended at Buckingham Palace in the days when King George VI still ruled the British Empire and the future Queen Elizabeth II was just a child. Many years later, in the early years of the 21st century, it was fascinating to hear him tell anecdotes of days long past from his career – rather like reading an animated history book.

During the Second World War, his fluency in English led him to serve as a liaison officer for the British military. However, it was after the conflict that his career took him down the route of European science – an unexpected detour for someone who had been discouraged by his maths teacher from pursuing a career in science. François was sent to the US to serve on the United Nations Atomic Energy Commission. There he met several renowned physicists, including the American Robert Oppenheimer, with whom he forged a friendship, and the Frenchmen Pierre Auger, Francis Perrin, Lew Kowarski and Bertrand Goldschmidt. François took up their cause. European physicists and some of their American counterparts were convinced that fundamental research in Europe needed to be brought back to life, and that this could only be achieved if the countries that had just been at war co-operated. The instruments that were needed to further the study of the infinitesimally small were particle accelerators, which were too expensive for any individual European country to build.

François and a handful of physicists embarked on a tour of Europe to appeal for the creation of the first European organization for fundamental research. Their objective was to pool resources for research to provide researchers with the tools that they needed and so curtail the brain drain. Pierre Auger, director of UNESCO’s Natural Sciences Department, organized an intergovernmental conference in Paris in 1951, presided by François, during which the first resolution for the creation of a European Council for Nuclear Research was adopted. The rest is history: CERN was established by 12 European states in 1954.

François became France’s delegate to the CERN Council and later served as president of Council from 1958 to 1960. In this capacity, he gave a speech at the inauguration of the Proton Synchrotron (PS), which for a few months was the most powerful accelerator in the world. His visionary nature was evident in this speech, which he gave in front of an audience of well-known faces and legendary physicists including Niels Bohr and Werner Heisenberg. “The people who will meet here,” he said of CERN, “who will come from the member states and beyond to work together on a wholly peaceful and impartial mission, are united by the same passion for knowledge and subject to the same rules of utmost intellectual integrity.” Today, CERN welcomes researchers from all over the world and its membership has recently been opened to non-European states, but a few years after its founding, such an international future was still a long way off.

During his mandate, François negotiated CERN’s extension into French territory, which was agreed in a treaty signed in 1965. To commemorate his role in this milestone, CERN gave François a piece of rock drilled from the site, engraved with the words: “À François de Rose – La science ne connaît pas de frontières” (“To François de Rose – Science knows no borders”).

François continued to pursue his diplomatic career for many years. Notably, he served as the French ambassador to Portugal from 1964 to 1969 and as the permanent representative of France to the NATO Council from 1970 to 1974. He was well known as a specialist in defence and nuclear matters. For a long time, he was an eminent member of the London-based International Institute for Strategic Studies, whose expertise in international strategy and military matters is world renowned.

The diplomat would remain attached to CERN, which he described as “the most beautiful feather in my ambassador’s cap”. He continued to take an interest in and show his enthusiasm for scientific discoveries, even in his final years. In 2010, when he came to CERN to celebrate his 100th birthday, he promised to return when the Higgs boson was discovered – a promise that he fulfilled last year with a further visit to the laboratory. During this last visit, he expressed with modest sincerity his great admiration for the physicists that he met – a mutual admiration that led to some often comical exchanges of compliments.

François had a strategic vision for science, a vision that drove him to contribute to CERN’s creation in the hope that scientific collaboration between countries that had been at war would play a part in maintaining sustainable peace. A humanist, he always used CERN to counter the arguments of the Eurosceptics. When he met some members of the French parliament during his visit to CERN last year, at the height of the European crisis, he said to them: “When Europeans unite, they can do great things.”

Optimistic and full of energy, he performed some substantial feats even in his later years. To mark his 90th birthday, he played 90 holes of golf in one day, and when he was 96, he travelled around Cape Horn to Patagonia with his two daughters. He regularly had opinion pieces published in major daily newspapers. To those who asked if he had a secret for reaching 100 years of age, he responded that it had simply required “patience, because it took quite some time”. He never failed to display elegance with a touch of humour, which charmed those who spoke to him. During his last visit, he promised to come back for the next big discovery. “But you’ll have to be quick,” he joked, “I won’t be around forever.” Sadly, he was right again.

The World Wide Web’s 25th anniversary

In March 1989 at CERN, Tim Berners-Lee submitted his proposal to develop a radical new way of linking and sharing information over the internet. The document was entitled “Information Management: A Proposal”. And so the web was born. Now, Berners-Lee, the World Wide Web Consortium (W3C) and the World Wide Web Foundation are launching a series of initiatives to mark the 25th anniversary of the original proposal, and to raise awareness of themes linked to the web, such as freedom, accessibility and privacy.

Twenty-five years to the day after he submitted his proposal, on 12 March Berners-Lee, together with the Web Foundation, launched the “Web We Want” campaign. The aim is to promote a global dialogue and changes in public policy to ensure that the web remains an open, free and accessible medium, so that everyone around the world can participate in the free flow of knowledge, ideas and creativity online.

Berners-Lee announced the campaign at the Palais des Nations in Geneva on 10 December – Human Rights Day 2013 – during a series of conversations on a variety of issues in human rights, which were held in celebration of the 20th anniversary of the Office of the High Commissioner for Human Rights. There he set out the principles that inspire the movement for a free flow of information, such as affordable access, protection of privacy, freedom of expression, and neutral networks that do not discriminate against content or user.

Fittingly, the campaign is using the web to pass on the message, and it has already seen significant mobilization on social media with half a billion people worldwide hearing Berners-Lee’s call for a digital bill of rights in every country. CERN promoted the launch of the campaign on its website with a series of opinion pieces from early contributors and enthusiasts of the World Wide Web, which are republished here.

On the open internet and the free web

The internet created the platform and opportunity for people to communicate, to collaborate and to share at unprecedented scale and speed. The creation of the World Wide Web opened up these possibilities to the world, enabling individuals to participate and play their own creative role in the sharing of all human achievements.

This has enabled interactions between all sorts of people – from all sorts of domains, including business, government and scientific communities – for all manner of activities like never before in human history. The web has evolved from simple information sharing to transacting business through socializing and more recently collaborative problem solving in citizen cyber science. In these ways it harnesses the capabilities of humanity to do what we do best – share, learn, collaborate and innovate.

However, with this capability comes considerable responsibility. Basic human rights – including the right to freedom of expression and the protection of privacy – all need to be balanced and preserved in order that this incredible resource can be a safe and exciting place for creativity, for people of all ages and interests. The accessibility and openness of the internet are crucial to enabling new ideas to flourish and compete with long-standing traditions, and to ensure that the evolution of the web continues to proceed at a pace limited only by our ideas.

This responsibility rests with all of us – whether politicians, lawmakers, scientists or citizens – to ensure that the incredible progress we have made in the last 25 years, starting with the work of a few, and now capturing the innovations of many, can continue in an open, trusted, safe, free and fair way.

David Foster, Deputy Head of CERN’s IT department.

Minimizing the muddle

Reams of material have been written about where, why and when the World Wide Web was born, but what about its conception? Gestation was rather like that of an elephant – difficult to know it had started and taking almost two years to complete. In fact, I think the title of Tim Berners-Lee’s book Weaving the Web, published in 1999 with Tim dubbed the inventor, is a better metaphor. When do a spider’s first few threads become a web? And when, if ever, is the job finished?

In 1984, Tim was recruited by CERN’s Data and Documents (DD) division and he elected to join the Read-Out Architecture (RA) section in the On-Line Computing (OC) group. I was the RA section leader and Tim worked with (and without!) me for the next six years. Mike Sendall, the OC group leader, agreed our work plans and held our purse strings.

At the time, CERN hosted lots of small and medium-sized experiments using a variety of mini-computers, personal computers, operating systems, programming languages and network links. Back at the ranch, the OC group was endeavouring to provide data-acquisition systems, the software used by equipment closely connected to the detectors, for as many experiments as possible. The conundrum, as in other areas, was how to embrace heterogeneity without having squads of workers generating exclusive solutions to intrinsically identical problems for bewildered users. Just the kind of anarchic jumble that Tim found challenging.

Several of us believed that standardization, where apt, reduced waste and frustration. But the s-word was anathema in some corners of CERN, on the grounds that it stifled creativity, and we evangelists incurred the wrath of a few mandarins. Yet conformity seemed to rankle less when it came to electronics. Commercial companies were already competitively producing computer interfacing hardware that conformed to ANSI/IEEE international standards.

Hurrah! If you know the hardware you’re going to get, you can prescribe how to handle it. I had worked with the NIM (US)/ESONE (Europe) group that defined standard software routines for CAMAC interfacing and was on the committee developing hardware and software standards for the speedier FASTBUS system. Tim arrived as we were dotting the Is and crossing the Ts of the FASTBUS routines.

He was obviously a smart young man (smart-clever rather than smart-sartorial!), full of fizz and, as a bonus, entirely likeable. When he presented his ideas in our section meetings, few of us if any could understand what he was talking about. His brain would overtake his voice, and holding up signs saying “Tim, slow down” rarely had the desired effect. We sometimes asked him to put things in writing, which didn’t necessarily help either. One of his erstwhile colleagues recalls “we knew it was probably exciting, maybe even important, but that it could take hours to figure out”. Listening to one of Tim’s presentations today, one can still detect the run-away style, even after his training in public speaking. However, I remember an occasion when his delivery was impeccable, in a play performed by the Geneva English Drama Society!

Tim’s main activity in the RA section was his Remote Procedure Call (RPC) project, whereby a program on one computer could transparently call procedures – routines – on other computers, even if they used different operating systems and programming languages, and whatever the network connecting them. He wasn’t too pleased when I asked him to specify the FORTRAN binding for FASTBUS routines, that is, to define precisely the properties of the routines’ parameters as seen from within a FORTRAN program. Only later did he appreciate the value of that unwelcome task, when preparing the standards that would underpin the first two Ws of WWW. He knew that the job, however tedious, had to be done and done well, with the devil lurking in the nit-picking details.

Come 1990, another CERN reshuffle and Tim stayed behind in the new Computing and Networks (CN) division, while the rest of us went off to Electronics and Computing for Physics (ECP). Shortly afterwards I drifted away from ECP, but I will always retain happy memories of the 1980s and the pleasure of having Tim in our section. He was not the only singular character in that multifaceted team, but with his congenial personality he could work with anyone. At least I don’t recall having to field any complaints, apart from “what on earth is Tim proposing?” Well, now we know.

Peggie Rimmer, Tim Berners-Lee’s supervisor from 1984 to 1990.

Good old Bitnet, and the rise of the World Wide Web

Although I presented my PhD thesis a mere 17 years ago, the last back-up of my thesis, programs and data was saved on a 7-inch magnetic tape reel. This of course meant that I did my graduate studies at the time when the word “network” was most often used in the plural. Each and every network was endowed with its own set of applications and accessibility for e-mail, document exchange, remote interactivity and even chatting.

Yes, computer-mediated social interaction came long before the World Wide Web. In the late 1980s, connectivity exploded at universities and research laboratories around the world. One noticeable side product was that young academics started dating each other from across the globe!

All of this was a heterogeneous mess, of course. But at the same time, it was pleasurably low level, and it was awesome. You knew what was happening behind the scenes when retrieving data and documents, you knew the hops that your “Relay” instant messaging made on the Bitnet, because you simply had to know. Data, documents, social interaction – it was all there. It was cool and in some ways efficient, but not practical, and it scaled very poorly.

And so the World Wide Web arrived on the internet. With the web came an immediate sense of need: you needed a fancy personal homepage, complete with graphical interface and colour. The personal homepage was quickly perceived as a way of asserting one’s very existence. I was on a text-based, black-on-orange remote terminal, and I still remember putting together my first homepage late at night in early 1993, while one of the graphical stations was free in the research group.

The web was practical and universal, and the other networks quickly withered away in a form of Darwinian selection. The web quickly drove the quest for desktop computer stations with screens with graphics capability. I still opted for size and sharpness, staying with black and white for several years, while all of my colleagues seemed to be rubbing their sandy eyes after only a few hours of 15-inch colour experience.

The web brought a singular revolution that quickly changed every aspect of our screen work: a global, all-topic search possibility. Computer code, a formula, a result, a cooking recipe, a person, a phone number – everything was at hand in little more than an instant, with no physical displacement. We immediately started setting up analysis team pages to share progress more efficiently. I was in the DELPHI experiment Team 5, the “Higgs hunters” team. It was mostly pages with some expert documentation and links to plots, programs and data, but we also all invented countless ways to make information on the web dynamic. It took time and pain before it deserved the word interactive.

Today I sometimes have the impression that no development is ever made without constantly interrogating the web for advice, before even thinking through the problem: “Someone will surely have solved the problem in a better way, no?” is an all-too-common approach.

In those early days I rarely discussed my networked profession and life with friends and family – the web was just a new tool of my trade. After another long stay at CERN in 1993–1994, I went back to Stockholm in February 1994. Sitting quietly reading on the subway, it was with an indescribable surprise and awe for what was to come that I discovered an http address on a regular advertisement! Within months, commercial web addresses were all over our billboards in Sweden.

Back then, the good old Bitnet chat had a rule. The Dutch Master Operators insisted that “Relay is a ‘privilege’, NOT a right, and Relay abuse will NOT be tolerated!” I often wish the web had it too, including commercial boundaries under the same heading.

Richard Jacobsson, senior physicist on the LHCb experiment.

Not at all vague and much more than exciting

In 1989, when Tim Berners-Lee invented the World Wide Web at CERN, I was responsible for the laboratory’s multi-protocol e-mail gateway. I remember discussing with Tim naming conventions for applications, and configuration rules for the first mailing lists that he requested to allow pioneer websites to discuss World Wide Web code.

We attended technical meetings sponsored by the European Commission – myself for e-mail standardization, and Tim for the Information Services Working Group (WG) – where he presented his code, and some Scandinavian universities even showed an interest in installing it.

Tim conceived, wrote and presented the web as an open, distributed, networked medium. He believed that the web should be accessible to everyone, everywhere – embracing, from the first web conference at CERN in 1994, development for people with disabilities or with sub-optimal network infrastructure. He presented the web – in his proposal to CERN in March 1989 – as a platform for scientific collaboration, and 20 years later reinforced this commitment by announcing http://webscience.org as a home for scientists online.

And Tim Berners-Lee continues to strive for a free, open web today. Setting up the World Wide Web Foundation was just one of the many steps he took to maintain this ideal. On 12 May last year, at the United Nations in Geneva, Tim announced the Web We Want campaign, which will form the centre of the debate around today’s information-surveillance methods.

As a CERN scientist, I share Tim’s ideas for an open, collaborative web. I believe that CERN’s software development based on web standards should be linked to the relevant working groups in the World Wide Web Consortium (W3C) – the main international standards organization for the World Wide Web.

Up until 1998, in the Web Office at CERN, we were still able to count all the world’s web servers. We still thought we could keep track of the web’s expansion. Apache put an end to this, as starting one’s own web server became so easy. But we were still writing search algorithms of our own, with integrated dictionaries for natural-language searches, with help from technical students. We enjoyed, at the time, a certain pluralism, because we had multiple commercial or public-domain products to compare and evaluate: search engines, web calendars and editing tools. We didn’t use “Google” as a synonym for “search”.

The explosion of websites around the turn of the century highlighted the importance of identifying trustworthy information online. At CERN, we understand that presence on the web doesn’t necessarily make information valid – it must be recent and come from a trusted source. Sophisticated algorithms are developed to promote web content by devious means, such as the clever use of metadata to “arrange” the ranking of search results, spread false rumours and manipulate public opinion. Browsing today requires a discerning eye and a knack for research.

Today, CERN software developers write grid middleware, data-management software, collaborative tools, repositories for data-preservation projects and web-based applications. They use, among other standards, the http protocol. A collaboration with relevant W3C working groups would lead to technical benefits in these times when resources are limited and the web has become much more than a document repository.

The web has changed human society more radically than Gutenberg’s printing press. It is a valuable platform for education and free exchange of ideas. But it can also be a tool for propaganda and surveillance.

Now more than ever, we at CERN should keep in touch with the evolution of the web: after all, it changed the world as we know it at the end of the 1980s – it could do so again.

Maria Dimou, CERN computer scientist and early web contributor.

Origins: the early days of CERN

François de Rose

In 1946, a commission of the United Nations Security Council was entrusted with the task of making proposals to bring atomic energy under international control. It was one year after the devastation of Hiroshima, and the idea of such control had been approved by all the governments. The commission was made up of influential scientists, who had the knowledge needed to understand the problem fully, and of politicians and diplomats representing the governments’ interests. It was in this capacity as a diplomat that I represented France on the commission and was able to establish trusting and friendly relations with many of my countrymen who were scientists, as well as with foreign scientists, first and foremost among whom was Robert Oppenheimer, who was to play a very important role in the creation of CERN.

In the course of the many conversations I had with Oppenheimer in the US, in which we were often joined by other Frenchmen, who were my scientific and technical advisers, he confided his worries about the future development of fundamental physics in Europe. “Almost all we know, we have learnt in Europe” is the substance of what he said. He himself had been a pupil of Niels Bohr in Copenhagen. “But in the future,” he continued, “research is going to require industrial, technical and financial resources that will be beyond the means of individual European countries. You will therefore need to join forces to pool all your resources. It would be fundamentally unhealthy if European scientists were obliged to go to the US or the Soviet Union to conduct their research.”

Early in 1950, convinced by this argument, Francis Perrin, then high commissioner for atomic energy in Paris, and I began to visit the main European research centres that would need to be persuaded. We met with a favourable response from Edoardo Amaldi in Italy, Niels Bohr in Copenhagen, Paul Scherrer in Switzerland and possibly Werner Heisenberg in Germany, if I remember correctly, but we were given a cooler reception in other capitals. Nevertheless, the idea was now on the table and was no doubt starting to take root in people’s minds. Moreover, it came on top of an appeal on similar lines from the European Centre for Culture in Geneva, led by Denis de Rougemont from Switzerland and Raoul Dautry from France. It was then that Isidor Rabi, a Nobel prize winner, made his crucial speech at the UNESCO General Conference in Florence in June 1950. Speaking on behalf of the US, he more or less said the same thing that Oppenheimer had said to us in private.

This speech marked a definite turning point, persuading the majority of European scientists and their governments to adopt a resolution authorizing UNESCO to “assist and encourage the formation and organization of regional centres and laboratories in order to increase and make more fruitful the international collaboration of scientists”. Pierre Auger, UNESCO’s director of natural sciences, took matters in hand and, at the end of 1951, managed to organize a conference of all European scientists and government representatives, which I had the honour to chair and at which it was decided to establish the European Council for Nuclear Research.

The fundamental ideas, namely the goals that all the pioneers of what was to become CERN set themselves, consisted first of all in promoting European co-operation in this vital area. CERN was thus the first venture on a European scale and I can say that Robert Schuman, who was then French minister of foreign affairs and one of Europe’s founding fathers, was immediately in favour of it. A second goal was to reintroduce complete freedom of communication and the sharing of knowledge into this branch of science.

It should be realized that, in the wake of Hiroshima, people were afraid of science and of nuclear science in particular. “The physicists have known sin” said Oppenheimer, and the consequence of using scientists’ work for military purposes was the imposition of secrecy and the lack of communication between research centres. By immediately taking the opposite approach to fundamental research in its statutes, CERN was following the great tradition of science knowing no boundaries. The ambitions of these pioneers were more than fulfilled, since CERN is today home to scientists from all over the world, including the US, China, Japan and Russia, all working together and in teams on the same research, the results of which are published in full.

Another of my memories concerns the extension of the CERN site into France. After the construction of the 28 GeV Proton Synchrotron, it soon became apparent that, in the time-honoured fashion, this was only a scale model of more powerful machines to come. The area that Switzerland had been able to set aside for CERN could not be extended on the Swiss side. Luckily, the site ran alongside the border with France, and the land in that area was essentially being used for farming. The continuation and development of CERN’s activities were therefore dependent on extending the site into France, thus requiring a parcel of around 500 hectares of French land to be made available to an international organization with its headquarters in Switzerland. I prepared a dossier, which was submitted to the then French president, General de Gaulle, by the minister of foreign affairs, Maurice Couve de Murville. That is how CERN became – and I think remains to this day – the only research centre to straddle the border of two countries.
