By Fayyazuddin and Riazuddin, World Scientific
Hardback: £54 $82
The Pakistani brothers, who were both students of Abdus Salam, wrote the first edition of their book in 1992, based on lectures given in various places. Aimed at senior undergraduates or graduate students, it provides a comprehensive account of particle physics. Having first been updated in 2000, this latest edition contains many revised chapters, in particular those that cover subjects such as heavy flavours, neutrino physics, electroweak unification, supersymmetry and string theory. Another addition is a substantial number of new problems. This self-contained book covers basic concepts and recent developments, as well as overlaps between astrophysics, cosmology and particle physics.
In 2009, CERN’s Proton Synchrotron (PS) reached its half century, having successfully accelerated protons to the design energy for the first time on 24 November 1959. Still in operation more than 50 years later, it is not only a key part of the injection chain to the LHC but also continues to supply a variety of beams to other facilities, from the Antiproton Decelerator to the CERN Neutrinos to Gran Sasso project. During its operation, the PS witnessed big changes at CERN; at the same time, particle physics itself advanced almost beyond recognition, from the days before quarks to the current reign of the Standard Model.
At the close of the anniversary year, CERN held a symposium honouring the accelerator developments at the laboratory and the concurrent rise of the Standard Model: “From the PS to the LHC: 50 years of Nobel Memories in High-Energy Physics”. Fittingly, at the end of 2009, the LHC – the machine that everyone expects to take the first steps beyond the Standard Model – was just getting into its stride after the first collisions in November.
Key players who had been close to all of these developments, including 13 Nobel laureates, came together for the symposium. Now, several of the talks have been written up and published in the latest edition of The European Physical Journal H – the journal launched in 2010 as a common forum for physicists, historians and philosophers of science. The edition also includes three additional articles that were invited to provide a more complete picture, by covering CERN’s Intersecting Storage Rings, the history of stochastic cooling and searches for the Higgs boson at the Large Electron-Positron (LEP) collider – which started up in 1989 and hence celebrated its 20th anniversary at the symposium.
Dip into the pages and you will find many gems: among the Nobel laureates, Jerome Friedman describes the work at SLAC that revealed the reality of quarks, which were unheard of in 1959; Jim Cronin revisits the early 1960s when he and his colleagues discovered CP violation; Jack Steinberger looks back at early experiences at CERN; Carlo Rubbia presents the story of the discovery of W and Z bosons at CERN; and Burt Richter recalls early ideas on LEP, from his days on sabbatical at CERN. On the accelerator side, the articles detail developments with the PS, as well as the highlights (and lowlights) of the construction and running of LEP. The invited article on stochastic cooling includes the work of Simon van der Meer, who shared the Nobel prize with Carlo Rubbia in 1984. Sadly, he was too ill to attend the symposium and passed away in March 2011.
All of the articles provide an interesting view of remarkable events through the reminiscences of people who were not simply “there”, but who played a big part in making them happen. They are a fascinating reminder of what particle physics was like in the past and well worth a read. They also reflect the different styles of the various individuals, but not so much, perhaps, as did the original presentations at the symposium. To get the full flavour, and to see all the participants, take a look at the recordings. There you will find still more gems.
Towards the end of July 1958, at a house in the hills south-east of Rome, three Italian scientists discussed key ideas that were to form the foundations of the European Space Agency (ESA). Edoardo Amaldi, who had been instrumental in the establishment of CERN four years previously, was with Giorgio Salvini – whose house it was – and Gino Crocco, who was Goddard Professor of Jet Propulsion at Princeton in the US. During their conversation, the old friends discussed how European countries, in particular Italy, could become involved in space research. Only the previous October, the Soviet Union had opened up the space age with the launch of the first artificial satellite, Sputnik 1. This had been followed in January 1958 by Explorer 1, launched by the US. So what could Europe do?
As Salvini recalls, the conversation was “long and animated”. While Crocco was sceptical about what Italy could achieve, Salvini was more optimistic, and Amaldi, with all of his experience in setting up CERN, saw the case for an organization that would enable European countries to work together on research in space. In particular, Amaldi insisted on two points: that there should be no military involvement and that such an organization should be based on the successful model that had given rise to CERN.
At the end of the year, Amaldi wrote to Crocco at Princeton, describing the contacts that he had made in the meantime with some influential scientists. In the letter, Amaldi went on to describe how he thought the project to launch a “Euroluna” (“Euromoon”) satellite for scientific research should take shape. The letter makes clear his insistence that the underlying organization should not be linked to the military but should be purely scientific and based on the same principles as CERN.
As a starting point, Amaldi suggested that a small group of experts from the major European countries could prepare a plan for creating an appropriate organization. By early 1959 he had discovered an ally in an old friend, Pierre Auger, the French cosmic-ray physicist who had also been involved in setting up CERN. By May, after several interactions with Auger, Amaldi had written the first draft of his paper, Space Research in Europe, with the aim of stimulating discussions on the formation of a European organization for space research. A French version, together with supportive comments from several countries, was distributed in December (Amaldi 1959).
In Amaldi’s original vision, not only the development of the satellites – the “Eurolunas” – but also that of their launchers would be the responsibility of the organization, which would need experts in the technology and engineering of rockets as well as space scientists. The idea was to mirror CERN, which had accelerator physicists and engineers to build its own machines for the high-energy-physics community to use in scientific research. By collaborating at CERN, Europe’s scientists had access to accelerators that no country had the means to build on its own.
It soon became clear that this vision was not to be, at least not to begin with. There was too much political and commercial interest surrounding the construction of rockets. Governments, in particular the British and French, began the negotiations that would separate the business of building launchers from that of making the satellites for scientific research. On 29 March 1962 in London, seven countries – Belgium, France, Germany, Italy, the Netherlands, the UK and Australia (associate member) – signed the convention that created the European Launcher Development Organisation (ELDO). Three months later, on 14 June 1962 in Paris, Belgium, Denmark, France, Germany, Italy, the Netherlands, Spain, Sweden, Switzerland and the UK signed a different convention, in this case to create the European Space Research Organisation (ESRO).
The foundation of these separate bodies may have been counter to Amaldi’s vision for an organization similar to CERN but they were the forebears of ESA, which was established in May 1975. With the formation of ESA, the science and the means to do it were brought into the same fold.
Amaldi’s letter to Crocco, which is translated from Italian on the following pages, constitutes the first document in which a European space organization is mentioned. It is for this reason that 10 copies recently went into space on board a spacecraft taking essential supplies to the International Space Station (ISS). ESA’s 3rd Automated Transfer Vehicle (ATV), named in honour of Amaldi, arrived at the ISS on 29 March 2012, exactly 50 years to the day after the convention that created ELDO was signed in London. Appropriately, the ATV had been launched by an Ariane rocket built by ESA. The copies of the letter will be signed by the astronauts and brought back to Earth by a Soyuz spacecraft. One will be given to CERN.
Amaldi’s 1958 letter, translated
16 December 1958
Prot. No 4674/A
Distinguished Prof.
Gino Crocco
College Road 74
PRINCETON – N.J.
Dear Gino,
After our discussion at Salvini’s home in Rocca di Papa at the end of July, I thought over the possibility of developing an appropriate activity in Europe in the field of rockets and satellites. It is now quite evident that this problem is not one for single states such as Italy, but mainly one at the continental level. Therefore, if such an endeavour is to be pursued, it must be done on a European scale, as was already done for the building of the large accelerators for which CERN was created.
The launch of one or more “Euroluna”, performed by a dedicated European organization, would definitely be of the highest importance, both moral and practical, for all the nations of the continent.
With these ideas in mind, at the end of July I wrote a letter to [Luigi] Broglio who replied, at the end of August, expressing his substantial agreement with the theoretical formulation of the problem but also a considerable scepticism with regards to the practical feasibility of an actual project.
During the Conference of Geneva, held in the first half of September, I had the opportunity to discuss it with [Isidor] Rabi, who reacted very positively and stated that, were this to develop further, he would do everything possible to obtain the support of the United States. Indeed, being himself a representative of the United States on the NATO Science Committee, he thought that this could be the initiating body for the activity; however, I think this wouldn’t be appropriate, as I shall explain later.
In November I spoke to [Harrie] Massey of [University College] London who, however, was rather sceptical; though this is the normal British attitude in the face of any continental initiative.
At the beginning of December I spoke about the matter with [Francis] Perrin, who was very interested and convinced, and who promised to look for some competent people in this specific field in France who could raise the problem.
The idea I have about this organization is that, in addition to the six EURATOM nations, Britain and the Scandinavian countries should participate in the manufacturing of satellites. Britain would at first limit itself to sending some observers and would probably show some resistance, but would certainly end up contributing substantially, should the project start taking shape.
It should, in my opinion, proceed as follows: some authoritative expert in the field (Broglio, I hoped, but he seems not to have the necessary enthusiasm) should start raising the problem and obtaining the participation of one or two experts from the largest European countries. Some Italian, French and German experts would be needed to start. These five or six people should prepare, within a few months, a plan of technical development containing:
1) a very well-defined scope, which should be ambitious enough to be comparable with the targets that the USA and the USSR have set for themselves in this field, so as to justify the European character of the endeavour;
2) an assessment of the cost and its time distribution;
3) an assessment of the specialized workforce;
4) a realistic time frame.
Such a programme should be submitted to the governments for approval and for the resulting creation of the final organization, which should be provided with the necessary resources.
In the case of CERN, things essentially developed as mentioned above; however, that case took advantage of the existence of UNESCO which, by calling the representatives of the governments to a first conference, played the role of the mother and nurse of CERN. I do not know who could be the mother and nurse of the new organization; according to Rabi this could be the “Science Committee” of NATO, but I believe that it wouldn’t be the best mother for such an organization. As a matter of fact, I think that it is absolutely imperative for the future organization to be neither military nor linked to any military organization. It must be a purely scientific organization open, like CERN, to all forms of co-operation both inside and outside the participating countries. I have the impression that all attempts to set up international organizations of a military nature have either failed or, if they didn’t fail, have taken on characteristics that do not satisfy even their own promoters and managers.
The high-level start-up project should include:
a) the construction of common European laboratories for solving the various major problems,
b) a related research programme to be run in the participating countries.
Through one or the other of these activities, the individual countries would have all the technologies at their disposal, and their scientific-technical structure would therefore be greatly strengthened. Such strengthening would evidently also bring great advantages in the military sector, in case defence activity became necessary, but it wouldn’t make the realisation of the programme more difficult and complicated, as would occur if the military, directly or indirectly, were the masters.
The financial problem, definitely irresolvable within the economy of one single country, could be solved in the context of the European continent.
The problem of the specialized workforce constitutes a second difficulty, but I believe that this could be solved in such a project; this would have the double advantage of attracting the liveliest part of the new generation and making it possible to recover academics who work outside Europe.
I would like to ask you to think about what I wrote here and to reply, as soon as possible, to the following questions which, in a more or less direct manner and on different levels, are related to the project mentioned above:
1) I would like to know whether you are interested and whether you would like to take an active role or even the leading role in it. Personally I don’t want to be involved in all of this except for launching the idea, at this stage, and later – in a few years – if the idea becomes reality, for participating in collecting the scientific data which can be obtained with this kind of activity;
2) I would like to know from you the names of the most competent and open persons in this field in Italy, France, Germany, Great Britain and the Scandinavian countries. As I already told you, I have been in contact with Broglio since July, but he seemed too sceptical to take this route, for the moment at least;
3) I would like to know which organizations, even of modest size, exist in Italy in this field and can provide an absolute guarantee of trust; for example, I came into contact with SAMI’s engineer Salvatore, but I have no idea of either the value and competence of this person or the robustness of the company. The seriousness of the people is a fundamental issue; this venture is destined to fail if people who are not sufficiently trustworthy slip into the initial organization committee.
Furthermore, I would like to have von Karman’s address; Rabi asked my permission to speak to him about this and I agreed, but I don’t know whether he actually did so or whether this would be of any help. I would like to have your opinion on this subject too; nevertheless, I think that an authoritative person like him could, if favourable, have a considerable influence.
I believe that you will be very much surprised by this letter of mine; it is based on my experience with CERN: in 1952 only three or four persons in the whole of Europe believed in the possibility of creating CERN, but in 1958 the laboratories in Geneva have grown to more than 800 workers, the first machine has started running, giving first-class scientific results, and the second machine will work before mid-1960.
I believe that, if the European experts in the field of rockets and satellites start moving now, they will be in a position, alongside the American and Russian groups, to contribute very substantially to the study of space by 1965.
I take this opportunity for sending you and your wife my best wishes, including among them the wish for a Euroluna before 1965.
By Richard J Szabo, Imperial College Press
Hardback: £42 $68
E-book: $88
Originally published in 2004, this book provides a quick introduction to the rudiments of perturbative string theory and a detailed introduction to the more current topic of D-brane dynamics. The presentation is pedagogical, with much of the technical detail streamlined. The rapid but coherent introduction to the subject is perhaps what distinguishes this book from other string-theory or D-brane books. This second edition includes an additional appendix with solutions to the exercises, thus expanding the technical material and making the book more appealing for use in lecture courses. The material is based on mini-courses in theoretical high-energy physics delivered by the author at various summer schools, so its level has been appropriately tested.
By David Goodstein (ed.), World Scientific
Hardback: £57 $86
E-book: $112
This up-to-date collection of review articles offers a general introduction to cosmology by experts in various fields. It starts with “Galaxy Formation from Start to Finish” and ends with “The First Supermassive Black Holes in the Universe”, exploring in between the grand themes of galaxies, the early universe, the expansion of the universe, neutrino masses, dark matter and dark energy. Together the chapters provide a detailed view of what is known about the universe as well as what remains unknown. Students, researchers and academics interested in cosmology should find this book useful.
By Alessandro De Angelis, Springer
Paperback: £19.99 €24.44
In telling the story of “the enigma of cosmic rays”, physicist and enthusiastic communicator Alessandro De Angelis traces the fascinating adventure of cosmic rays since their discovery a century ago. Today, the exploration of the mysteries of cosmic rays continues with even more powerful tools in a range of energies that extends 20 orders of magnitude.
Cosmic rays have always been puzzling. In the first decade of the 20th century, physicists were seeking a solution to the problem of why gold-leaf electroscopes – instruments that are still common in school laboratories today – discharge spontaneously. Many scientists tackled this problem, including an Italian, Domenico Pacini, who made some important measurements by immersing his instruments underwater at different depths and observing a marked decrease in the discharge rate. Indeed, Pacini was the first to give a clear indication that part of the natural radiation he detected came from the atmosphere and from the cosmos. However, his results were published only in Italian and had no great prominence – although Victor Hess did mention Pacini several times in his speech when he obtained the Nobel Prize in Physics for the discovery of cosmic rays. Pacini’s work is yet another glaring example of a discovery that has not obtained the international recognition it deserves.
The riddles of cosmic rays do not end there. We still do not know for sure where they come from. They are deflected by the interstellar magnetic field so their direction of arrival cannot be connected to their starting point. Above all, we still struggle to understand what mechanism provides them with an energy that can in extreme cases reach the energy of a tennis ball concentrated in a single atomic nucleus. Enrico Fermi proposed a theory for the acceleration of cosmic rays that explains in part what is observed. However, there is still much to understand and we hope that recent and future results in high-energy astrophysics will be able to answer this fundamental question.
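As a rough check of that tennis-ball comparison (a back-of-the-envelope sketch, assuming a standard 57 g ball travelling at about 40 m/s; the most energetic cosmic rays ever recorded carry roughly 3 × 10^20 eV):

\[
E = \tfrac{1}{2} m v^{2} = \tfrac{1}{2}\,(0.057~\mathrm{kg})\,(40~\mathrm{m/s})^{2} \approx 46~\mathrm{J} \approx \frac{46~\mathrm{J}}{1.6\times10^{-19}~\mathrm{J/eV}} \approx 3\times10^{20}~\mathrm{eV},
\]

which is indeed the kinetic energy of a gently served tennis ball packed into a single atomic nucleus.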
What is sure is that cosmic rays bring to the Earth pieces of the far-away universe. Furthermore, their high energy makes them interact with the atmosphere, producing secondary particles – as in powerful particle accelerators. For this reason, in the first half of the past century cosmic rays revealed the first particle of antimatter – the positron – and many new particles that led to the birth of elementary particle physics before accelerators made by humans turned it into a mature science. Even today, in the LHC era, the study of high-energy cosmic rays and the precision testing of their composition at intermediate energies are active fields of research, with experiments on Earth and in space. In particular the first evidence of neutrino oscillations – and thus of their mass – was observed by studying the secondary neutrinos produced by cosmic rays in the atmosphere.
This book by De Angelis traces the history of the study of cosmic rays in a documented, comprehensive way, often providing details both interesting and little known. It is easily readable and an excellent reference for anyone interested in fundamental physics and contemporary astrophysics.
By Jacques Vanier, Imperial College Press
Hardback: £74 $120
Paperback: £33 $54
In this book, Jacques Vanier gives a comprehensive picture of the physical laws that appear to regulate the functioning of the universe, from the atomic to the cosmic scale. It describes the main fields of physics as applied to the atomic world and the cosmos, explaining how the universe evolved to its present state. This is done with almost no equations, although there is a short annexe for readers who wish to see how the principles and laws expressed in words can be rendered in the language of mathematics. The author also occasionally uses two young people placed in various situations to explain aspects of physics through their observations.
Take a 27-km, record-breaking machine, with 10,000 scientists from 100 countries and 630 institutions, throw in selected artists and arts specialists, and what do you get? An experiment to bring about head-on collisions between things that are even more elusive than the Higgs boson – creativity, imagination and human ingenuity. Without them, science, art and technology would not exist. The name of this experiment is Arts@CERN, and last year saw the switch-on of this new and rather different collider at CERN.
The start-up has seen CERN collaborate in the world’s most prestigious digital-arts festival, Ars Electronica, in Linz; feature in the keynote event at the Agenda 2016 conference at Moderna Museet, in Stockholm; supply live footage from the LHC to the US film director David Lynch for the Mathematics exhibition at one of the world’s leading contemporary arts museums, Fondation Cartier, in Paris; and have its research into antimatter feature on the centre spread of China’s best-selling design magazine.
Other results of the arts switch-on involve specially curated visits to CERN’s facilities for leading international artists. Recently these included the Swiss video artist Pipilotti Rist, the Polish conceptual artist Goshka Macuga and the master of contemporary dance, the US choreographer William Forsythe, as well as up-and-coming young artists, such as performer Niamh Shaw from Ireland. And to cap it all, this year CERN has two artists in residence on the new, three-year international artists’ residency programme, Collide@CERN, which is funded and supported by external donors and partners.
This all seems a long way from 2009, when I was given the opportunity to go anywhere in the world after I received the Clore Fellowship – an award for cultural leadership. Instead of taking the opportunity to work in a famous arts organization, I decided to approach CERN about coming for three months, supported by the UK government, which funded my award, to carry out a feasibility study for an artists’ residency scheme. Little did I know that I would be hired in the spring of 2010 to build a p(art)icle collider for CERN.
So why should CERN engage with the arts? CERN has a mission to engage science in society. The arts reach areas that science and technology alone cannot reach – touching the public who might otherwise be turned off. By joining forces, arts, technology and science make an unbeatable force for change and innovation in the 21st century, as Eric Schmidt, now executive chair of Google, points out. In the words of CERN’s director-general, Rolf Heuer: “They are expressions of what makes us human in the world.”
This phrase, more than any other, shows what is behind CERN’s high-level engagement with the arts and can be summed up in a simple equation: arts + science + technology = culture. For an organization to be truly cultural and innovative in the 21st century, it has to embrace all factors and facets of human experience, engaging with them on the same level of excellence as its institutional values.
Science and the arts are intimately connected in other ways, too. The British sculptor Antony Gormley is one of several leading international artists who are the patrons of the Collide@CERN artist in residence scheme. He recently donated one of his pieces, Feeling Material, to CERN in acknowledgement of the inspiration of particle physics on his work; it now hangs in the Main Building. Gormley is clear about the connection between art and science: “My whole philosophy is that art and science are better together than apart. We have somehow accepted an absolute division between analysis and intuition but I think actually the structures that they both come up with are an intricate mix of the two.”
The showpiece event that signalled the switch-on of CERN’s arts experiment was the six-day Ars Electronica Festival in Linz in 2011. Being the world’s leading digital-arts festival, it features spectacular performances in and around its state-of-the-art building and museum in addition to digital-arts exhibitions and interventions throughout the city. In 2011, CERN was the major collaborative partner and inspiration for the festival, which was called “Origin” and attracted more than 70,000 visitors from 33 countries. A symposium explored the importance of fundamental research and CERN’s collaborative international organizational structure. Even the logo for the festival was taken from the collisions in the ATLAS detector. CERN’s director of research and innovation, Sergio Bertolucci, and the director-general both spoke at the festival, and researchers from the experiments at the LHC gave the public “walk and talk through” guides to the innards of the detectors, with extraordinary high-resolution images.
That was not all. Ars Electronica and CERN also announced at the festival a landmark, three-year international cultural partnership with the launch of the annual Prix Ars Electronica Collide@CERN award for digital artists. The prize is a residency at both institutions lasting three months – two months at CERN for inspiration and one month at Ars Electronica for production. The first competition attracted 395 artists from 40 countries – from Azerbaijan and Uzbekistan, Brazil and Iceland, as well as from across Europe and the US. The winning artist was the 28-year-old Julius von Bismarck – one of the rising stars of the international arts scene, who is currently studying with the celebrated Icelandic Danish artist Olafur Eliasson at the Institute of Spatial Experiments in Berlin.
It was only after awarding von Bismarck the prize that the jury discovered that he had wanted to be a physicist, and that both his brother and his grandfather are physicists. This only goes to prove the point at the heart of the Arts@CERN initiative – that scientists and artists are inter-related. He has just completed his residency of two months at CERN, being inspired by the science and the environment and having been “matched” with James Wells, a theorist at CERN, as his partner for scientific inspiration.
During his time at the laboratory, von Bismarck carried out interventions in perception among the CERN community and held many informal discussions. He is now at Ars Electronica’s transdisciplinary innovation and research facility, Futurelab, producing the ideas generated at CERN. He is working with his production mentor Horst Hoertner – one of the co-founders of the Prix Ars Electronica Collide@CERN. He will showcase the work at this year’s Ars Electronica Festival before bringing the piece back to CERN for a lecture on 25 September. However, the ripples of the residency and the ideas will continue long after von Bismarck has left. As he stated after just two weeks at the laboratory: “This experience is changing my life.”
If this arts experiment sounds easy, it isn’t. As with any experiment, it needs expertise and knowledge to build it and make it happen, giving it foundations and structure. So I created for CERN its first arts policy, “Great Arts for Great Science”, putting the arts on the same level of selected excellence as the science to create truly meaningful, high-impact engagement, mutual understanding and respect between the arts and science. The first CERN Cultural Board was appointed at this high level of knowledge and excellence – to build expertise in the arts into CERN. The board members, honorary appointments for three years, are recognized leaders in their fields. They include the director-general of the Lyon Opera House, Serge Dorny, and the director of Zurich’s Kunsthalle, Beatrix Ruf, who is acknowledged as one of the most influential figures in contemporary art today. All of the board members donate their time and, crucially, the board also includes a CERN physicist, Michael Doser. Researchers from CERN are also on the juries for all of the artists’ residency awards.
Every year, the board will select at least one major arts project in which CERN officially collaborates, its stamp of approval enabling the project to find external funding. In 2012–2013, the selected project is the cutting-edge, multimedia/dance/opera/film Symmetry, by a truly international team of artists performing across several art forms, including the soprano Claron McFadden and the Nederlands Dans Theater dancer, Lukáš Timulak. The project is the brainchild of the emerging film director, Ruben Van Leer.
So, that is step one of building a p(art)icle collider – create the policy and the structure. The other steps were to: create the flagship Collide@CERN residency scheme; launch a website to make the work, visits and potential involvement with CERN of artists (past, present and future) visible and accessible; and finally give back to the CERN community by advising on home-grown initiatives that have international artistic potential. In 2010, one of my first acts was to carry out a major strategic review of the home-grown, biannual film festival CinéGlobe, created by CERN’s Open Your Eyes film club. The review recommended developing the brand, mission, vision and values, as well as substantial organizational restructuring and planning. I also suggested the slogan “Inspired by Science” to sum up the festival’s mission.
Two years since being hired by CERN, I am still there. It is the positive spirit of fundamental research – the quest to expand human knowledge and understanding for the good of all, engaging with cutting-edge ideas and technologies – that inspires me to work at CERN, as well as being the source of inspiration for artists. After all, landmark moments of science in the 20th century created some of the most significant arts movements of the modern world. My personal belief is that particle physics combines the twin souls of the artist – the theorist who thinks beyond the paradigms and the experimentalist who tests the new and brings them down to Earth. By building a p(art)icle collider, creative collisions between arts and science have truly begun at CERN.
Friday, 31 May 2001, 6 p.m. – Back in my office, I open my notebook and write “My understanding of MD’s ideas” in blue ink. I draw a box and write the words “Open Lab” in the middle of it. I’ve just left the office of Manuel Delfino, the head of CERN’s IT division. His assistant had called to ask me to go and see Manuel at 4 p.m. to talk about “industrial relations”. I’ve been technology-transfer co-ordinator for a few weeks but I had no idea of what he was going to say to me. An hour later, I need to collect my thoughts. Manuel has just set out one of the most amazing plans I’ve ever seen. There’s nothing like it, no model to go on, and yet the ideas are simple and the vision is clear. He’s asked me to take care of it. The CERN openlab adventure is about to begin.
This is how the opening lines of the openlab story could begin if it were ever to be written as a novel. At the start of the millennium, the case was clear for Manuel Delfino: CERN was in the process of developing the computing infrastructure for the LHC; significant research and development was needed; and advanced solutions and technologies had to be evaluated. His idea was that, although CERN had substantial computing resources and a sound R&D tradition, collaborating with industry would make it possible to do more and do it better.
Four basic principles
CERN was no stranger to collaboration with industry, and I pointed out to Manuel that we had always done field tests on the latest systems in conjunction with their developers. He nodded but stressed that here was the difference: what he was proposing was not a random collection of short-term, independent tests governed by various different agreements. Instead, the four basic principles of openlab would be as follows (I jotted them down carefully because Manuel wasn’t using notes): first, openlab should use a common framework for all partnerships, meaning that the same duration and the same level of contribution should apply to everyone; second, openlab should focus on long-term partnerships of up to three years; third, openlab should target the major market players, with the minimum contribution threshold set at a significant level; last, in return CERN would contribute its expertise, evaluation capacity and its unique requirements. Industrial partners would contribute in kind – in the form of equipment and support – and in cash by funding young people working on joint projects. Ten years on, openlab is still governed by these same four principles.
Back to May 2001. After paving the way with extensive political discussions over several months, Manuel had written a formal letter to five large companies, Enterasys, IBM, Intel, Oracle and KPN QWest, inviting them to become the founding members of the Open Lab (renamed “openlab” a few months later). These letters, which were adapted to suit each case, are model sales-pitches worthy of a professional fundraiser. They set out the unprecedented computing challenges associated with the LHC, the unique opportunities of a partnership with CERN in the LHC framework, the potential benefits for each party and proposed clear areas of technical collaboration for each partner. The letters also demanded a rapid response, indicating that replies needed to reach CERN’s director-general just six weeks later, by 15 June. A model application letter was also provided. With the director-general’s approval, Manuel wrote directly to the top management of the companies concerned, i.e. their chairs and vice-chairs. The letters had the desired effect: three companies gave a positive response by the 15 June deadline, while the other two followed suit a few months later – openlab was ready to go.
The first task was to define the common framework. CERN’s legal service was brought in and the guiding principles of openlab, drawn up in the form of a public document and not as a contract, were ready by the end of 2001. The document was designed to serve as the basis for the detailed agreements with individual partners, which now had to be concluded.
Three-year phases
At the start of 2002, after a few months of existence, openlab had three partners: Enterasys, Intel and KPN QWest (which later withdrew when it became a casualty of the bursting of the telecoms and dotcom bubbles). On 11 March, the first meeting of the board of sponsors was held at CERN. Chaired by the then director-general, Luciano Maiani, representatives of the industrial companies were in attendance as well as Manuel, Les Robertson (the head of the LHC Computing Grid project) and me. At the meeting I presented the first openlab annual report, which has since been followed by nine more, each printed in more than 1000 copies. Then, in July, openlab was joined by HP, and subsequently followed by IBM in March 2003 and by Oracle in October 2003.
In the meantime, a steering structure for openlab was set up at CERN in early 2003, headed by the new head of the IT Department, Wolfgang von Rüden, in an ex officio capacity. Sverre Jarp was the chief technical officer, while François Grey was in charge of communication and I was to co-ordinate the overall management. January 2003 was also a good opportunity to resynchronize the partnerships. The concept of three-year “openlab phases” was adopted, the first covering the years 2003–2005. Management practices and the technical focus would be reviewed and adapted through the successive phases.
Thus, Phase I began with an innovative and ambitious technical objective: each partnership was to form a building block of a common structure so that all of the projects would be closely linked. This common construction, which we were all building together, was called “opencluster”. It was a bold idea – but unfortunately too ambitious. The constraints ultimately proved too restrictive – both for the existing projects and for bringing in new partners. So what of a new unifying structure to replace opencluster? The idea was eventually abandoned when it came to openlab-II: although the search for synergies between individual projects was by no means excluded, it was no longer an obligation.
A further adjustment occurred in the meantime, in the shape of a new and complementary type of partnership: the status of “contributor” was created in January 2004, aimed at tactical, shorter-term collaborations focusing on a specific technology. Voltaire was the first company to acquire the new status on 2 April, to provide CERN with the first high-speed network based on InfiniBand technology. Another innovation followed in July. François set up the openlab Student Programme, designed to bring students to CERN from around the world to work on openlab projects. With the discontinuation of the opencluster concept, and with the new contributor status and the student programme, openlab had emphatically demonstrated its ability to adapt and progress. The second phase, openlab-II, began in January 2006, with Intel, Oracle and HP as partners and the security-software companies Stonesoft and F-Secure as contributors. They were joined in March 2007 by EDS, a giant of the IT-services industry, which contributed to the monitoring tools needed for the Grid computing system being developed for the LHC.
The year 2007 also saw a technical development that was to prove crucial for the future of openlab. At the instigation of Jean-Michel Jouanigot of the network group, CERN and HP ProCurve pioneered a new joint-research partnership. So far, projects had essentially focused on the evaluation and integration of technologies proposed by the partners from industry. In this case, CERN and HP ProCurve were to undertake joint design and development work. The hallmark openlab motto, “You make it, we break it”, was joined by a new slogan, “We make it together”. Another major event followed in September 2008 when Wolfgang’s patient, months-long discussions with Siemens culminated in the company becoming an openlab partner. Thus, by the end of Phase II, openlab had entered the world of control systems.
At the start of openlab-III in 2009, Intel, Oracle and HP were joined by Siemens. EDS also decided to extend its partnership by one year. This third phase was characterized by a marked increase in education and communication efforts. More and more workshops were organized on specific themes – particularly in the framework of collaboration with Intel – and the communication structure was reorganized. The post of openlab communications officer, directly attached to the openlab manager, was created in the summer of 2008. A specific programme was drawn up with each partner and tools for monitoring spin-offs were implemented.
Everything was therefore in place for the next phase, which Wolfgang enthusiastically started to prepare at the end of 2010. In May 2011, in agreement with Frédéric Hemmer, who had taken over as head of the IT Department in 2009, he handed over the reins to Bob Jones. The fourth phase of openlab began in January 2012 with not only HP, Intel and Oracle as partners, but also with Chinese multinational Huawei, whose arrival extended openlab’s technical scope to include storage technologies.
After 10 years of existence, the basic principles of openlab still hold true and its long-standing partners are still present. While I, too, passed on the baton at the start of 2012, the openlab adventure is by no means over.
It all started a year ago over dinner with a good bottle of wine in front of us. Steve Gourlay of Lawrence Berkeley National Laboratory, Stuart Henderson of Fermilab and I talked about the future of accelerator R&D in the US and what could be done to promote it.
We had no idea that an opportunity would present itself so quickly, that it would require such fast action or that blogging would be a central part of carrying out our mission.
A 2009 symposium called “Accelerators for America’s Future” had laid out some of the issues and obstacles, and in September 2011 the US Senate Committee on Appropriations asked the US Department of Energy (DOE) to submit a strategic plan for accelerator R&D by June 2012.
The DOE asked me to lead a task force to develop ideas about this important matter: what should the DOE do, over the next 10 years, to streamline the transfer of accelerator R&D so that its benefits could spread out into the larger society?
We were ready to go by October. The task force would have until 1 February 2012 – just four months – to identify research opportunities targeted to applications, estimate their costs and outline the possible impediments to carrying out such a plan. Based on this information, DOE officials would draw up their strategic plan in time for the congressional deadline.
It was a huge job. The 15 members of the task force, who hailed from six DOE national laboratories, industry, universities, DOE headquarters and the National Science Foundation, would need to gather facts, opinions and ideas from a range of people with a stake in this issue – from basic researchers at the national laboratories to university and industry scientists, entrepreneurs, inventors, regulators, industry leaders, defence agencies and owners of businesses both small and large.
We quickly held a workshop in Washington, DC, followed by others at the Argonne and Lawrence Berkeley National Laboratories, where we presented some of the major ideas. And to gather the most feedback from the most people in the shortest amount of time, I did something that I like to do: I started a blog.
Now, anyone who has been around high-energy physics for a while knows that blogs and other forms of cutting-edge social media are nothing new. We particle physicists, after all, started the World Wide Web as a way to share our ideas, and what became known as the arXiv to distribute preprints of our research results. Many physicists are avid bloggers, and a number of laboratories – from CERN to Fermilab and KEK – operate blogs of their own; you can see a sample of these blogs at www.quantumdiaries.org. But it’s less usual to incorporate a blog into the work of a task force – although, for the life of me, I don’t know why you would not want to do it.
One of the first things that I did when I came to SLAC two years ago was to start a blog aimed at fostering communication among people in the Accelerator Directorate. A blog is a great way to talk about topics that are burning under our fingernails – although sometimes one needs to overcome a certain amount of cultural resistance to get people talking freely. Instead of filling various inboxes with chains of e-mails, you get an “electronic blackboard” that is easy to read and easy to post on, and that even has the added convenience of notifying you when a new post goes up.
In the good old days you could have everyone come to one place and have a panel discussion or an all-hands meeting – an easy, free-flowing exchange of ideas. A blog can be just such a thing: open and inviting.
Our task force invited literally thousands of people to comment on the issues at hand. What can be done to move the fruits of basic accelerator research and development more quickly into medicine, energy development, environmental cleanup, industry, defence and national security? What good could flow from such a movement? What are the barriers – especially between the national laboratories, where most of this research is done, and the industries that could develop it into products – and how can they be overcome?
Not everyone answered, but many did. More than half of the responses that we got came in through the blog rather than as e-mail messages. Within a couple of days it became clear just from the people who blogged that the medical community is starving for facilities and infrastructure to develop radiation therapy further, mainly with heavy-ion beams. The people talked to us and among themselves. So it’s no surprise that the report we write will describe opportunities for the DOE to make its infrastructure available for researchers who want to pursue this line of work.
Others talked about the difficulties that they had in working with government agencies or national laboratories and how this could be made easier – a worthwhile read during an easy afternoon.
So, blogging is not just fun; it’s a great way to gather information and encourage dialogue. Once our task force finalizes its report, the site will be up for a while, and then, when the next issue arises, the blackboard will get cleaned and I will start a new one.