
L’enigma dei raggi cosmici. Le più grandi energie dell’universo (The Enigma of Cosmic Rays: The Greatest Energies in the Universe)

By Alessandro De Angelis
Springer
Paperback: £19.99 €24.44

In telling the story of “the enigma of cosmic rays”, physicist and enthusiastic communicator Alessandro De Angelis traces the fascinating adventure of cosmic rays since their discovery a century ago. Today, the exploration of their mysteries continues with ever more powerful tools, across a range of energies spanning 20 orders of magnitude.

Cosmic rays have always been puzzling. In the first decade of the 20th century, physicists were seeking a solution to the problem of why gold-leaf electroscopes – instruments that are still common in school laboratories today – discharge spontaneously. Many scientists tackled this problem, including an Italian, Domenico Pacini, who made some important measurements by immersing his instruments under water at different depths and observing a marked decrease in the discharge rate. Indeed, Pacini was the first to give a clear indication that part of the natural radiation he detected came from the atmosphere and from the cosmos. However, his results were published only in Italian and attracted little attention – although Viktor Hess did mention Pacini several times in the speech he gave on receiving the Nobel Prize in Physics for the discovery of cosmic rays. Pacini’s work is yet another glaring example of a discovery that never obtained the international recognition it deserved.

The riddles of cosmic rays do not end there. We still do not know for sure where they come from. They are deflected by the interstellar magnetic field, so their direction of arrival cannot be traced back to their starting point. Above all, we still struggle to understand what mechanism provides them with an energy that, in extreme cases, can reach that of a tennis ball concentrated in a single atomic nucleus. Enrico Fermi proposed a theory for the acceleration of cosmic rays that explains in part what is observed. However, there is still much to understand, and we hope that recent and future results in high-energy astrophysics will be able to answer this fundamental question.
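
To see what the tennis-ball comparison means in numbers, here is a minimal back-of-envelope sketch (not taken from the book; it assumes the often-quoted scale of about 3 × 10²⁰ eV for the most energetic cosmic rays on record and a regulation 57 g tennis ball):

```python
# Rough check of the "tennis ball" comparison (illustrative assumptions only).
EV_TO_JOULE = 1.602e-19       # joules per electronvolt
cosmic_ray_energy_eV = 3e20   # assumed energy of an extreme cosmic-ray nucleus
ball_mass_kg = 0.057          # regulation tennis-ball mass

energy_J = cosmic_ray_energy_eV * EV_TO_JOULE       # roughly 48 J
ball_speed = (2 * energy_J / ball_mass_kg) ** 0.5   # from E = (1/2) m v^2

print(f"Energy: {energy_J:.0f} J")
print(f"Equivalent tennis-ball speed: {ball_speed:.0f} m/s "
      f"(~{3.6 * ball_speed:.0f} km/h)")
# About 48 J, i.e. a tennis ball travelling at roughly 150 km/h,
# all carried by a single atomic nucleus.
```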

What is sure is that cosmic rays bring to the Earth pieces of the far-away universe. Furthermore, their high energy makes them interact with the atmosphere, producing secondary particles – as in powerful particle accelerators. For this reason, in the first half of the past century cosmic rays revealed the first particle of antimatter – the positron – and many new particles that led to the birth of elementary particle physics, before human-made accelerators turned it into a mature science. Even today, in the LHC era, the study of high-energy cosmic rays and the precision testing of their composition at intermediate energies are active fields of research, with experiments on Earth and in space. In particular, the first evidence of neutrino oscillations – and thus of their mass – was observed by studying the secondary neutrinos produced by cosmic rays in the atmosphere.

This book by De Angelis traces the history of the study of cosmic rays in a documented, comprehensive way, often providing details both interesting and little known. It is easily readable and an excellent reference for anyone interested in fundamental physics and contemporary astrophysics.

The Universe: A Challenge to the Mind

By Jacques Vanier
Imperial College Press
Hardback: £74 $120
Paperback: £33 $54

In this book, Jacques Vanier gives a comprehensive picture of the physical laws that appear to regulate the functioning of the universe, from the atomic to the cosmic world. He describes the main fields of physics as applied to the atomic world and to the cosmos, showing how the universe evolved to its present state. This is done almost entirely without equations, although there is a short annexe for readers who wish to see how the principles and laws expressed in words can be cast in the language of mathematics. The author also occasionally uses two young people, placed in various situations, to explain aspects of physics through their observations.

How to build a p(art)icle collider

Take a 27-km, record-breaking machine, with 10,000 scientists from 100 countries and 630 institutions, throw in selected artists and arts specialists, and what do you get? An experiment to bring about head-on collisions between things that are even more elusive than the Higgs boson – creativity, imagination and human ingenuity. Without them, science, art and technology would not exist. The name of this experiment is Arts@CERN, and last year saw the switch-on of this new and rather different collider at CERN.

The start-up has seen CERN collaborate in the world’s most prestigious digital-arts festival, Ars Electronica, in Linz; feature in the keynote event at the Agenda 2016 conference at Moderna Museet, in Stockholm; supply live footage from the LHC to the US film director David Lynch for the Mathematics exhibition at one of the world’s leading contemporary arts museums, Fondation Cartier, in Paris; and have its research into antimatter feature on the centre spread of China’s best-selling design magazine.

Other results of the arts switch-on involve specially curated visits to CERN’s facilities for leading international artists. Recently these included the Swiss video artist Pipilotti Rist, the Polish conceptual artist Goshka Macuga and the master of contemporary dance, the US choreographer William Forsythe, as well as up-and-coming young artists, such as the performer Niamh Shaw from Ireland. And to cap it all, this year CERN has two artists in residence on the new, three-year international artists’ residency programme, Collide@CERN, which is funded and supported by external donors and partners.

All of this seems a long way from 2009, when I was given the opportunity to go anywhere in the world after receiving the Clore Fellowship – an award for cultural leadership. Instead of taking the opportunity to work in a famous arts organization, I decided to approach CERN about coming for three months, supported by the UK government, which funded my award, to carry out a feasibility study for an artists’ residency scheme. Little did I know that I would be hired in the spring of 2010 to build a p(art)icle collider for CERN.

Making us human

So why should CERN engage with the arts? CERN has a mission to engage society with science. The arts reach places that science and technology alone cannot, touching members of the public who might otherwise be turned off. By joining forces, arts, technology and science make an unbeatable force for change and innovation in the 21st century, as Eric Schmidt, now executive chairman of Google, points out. In the words of CERN’s director-general, Rolf Heuer: “They are expressions of what makes us human in the world.”

This phrase, more than any other, shows what is behind CERN’s high-level engagement with the arts and can be summed up in a simple equation: arts + science + technology = culture. For an organization to be truly cultural and innovative in the 21st century, it has to embrace all factors and facets of human experience, engaging with them on the same level of excellence as its institutional values.

Science and the arts are intimately connected in other ways, too. The British sculptor Antony Gormley is one of several leading international artists who are the patrons of the Collide@CERN artist in residence scheme. He recently donated one of his pieces, Feeling Material, to CERN in acknowledgement of the inspiration of particle physics on his work; it now hangs in the Main Building. Gormley is clear about the connection between art and science: “My whole philosophy is that art and science are better together than apart. We have somehow accepted an absolute division between analysis and intuition but I think actually the structures that they both come up with are an intricate mix of the two.”

The showpiece event that signalled the switch-on of CERN’s arts experiment was the six-day Ars Electronica Festival in Linz in 2011. Being the world’s leading digital-arts festival, it features spectacular performances in and around its state-of-the-art building and museum in addition to digital-arts exhibitions and interventions throughout the city. In 2011, CERN was the major collaborative partner and inspiration for the festival, which was called “Origin” and attracted more than 70,000 visitors from 33 countries. A symposium explored the importance of fundamental research and CERN’s collaborative international organizational structure. Even the logo for the festival was taken from the collisions in the ATLAS detector. CERN’s director of research and innovation, Sergio Bertolucci, and the director-general both spoke at the festival, and researchers from the experiments at the LHC gave the public “walk and talk through” guides to the innards of the detectors, with extraordinary high-resolution images.

That was not all. Ars Electronica and CERN also announced at the festival a landmark, three-year international cultural partnership with the launch of the annual Prix Ars Electronica Collide@CERN award for digital artists. The prize is a residency at both institutions lasting three months – two months at CERN for inspiration and one month at Ars Electronica for production. The first competition attracted 395 artists from 40 countries – from Azerbaijan and Uzbekistan, Brazil and Iceland, as well as from across Europe and the US. The winning artist was the 28-year-old Julius von Bismarck – one of the rising stars of the international arts scene, who is currently studying with the celebrated Icelandic-Danish artist Olafur Eliasson at the Institute of Spatial Experiments in Berlin.

It was only after awarding von Bismarck the prize that the jury discovered that he had wanted to be a physicist, and that both his brother and his grandfather are physicists. This only goes to prove the point at the heart of the Arts@CERN initiative – that scientists and artists are interrelated. He has just completed his two-month residency at CERN, drawing inspiration from the science and the environment and being “matched” with James Wells, a theorist at CERN, as his partner for scientific inspiration.

During his time at the laboratory, von Bismarck carried out interventions in perception among the CERN community and held many informal discussions. He is now at Ars Electronica’s transdisciplinary innovation and research facility, Futurelab, producing the ideas generated at CERN. He is working with his production mentor Horst Hoertner – one of the co-founders of the Prix Ars Electronica Collide@CERN. He will showcase the work at this year’s Ars Electronica Festival before bringing the piece back to CERN for a lecture on 25 September. However, the ripples of the residency and the ideas will continue long after von Bismarck has left. As he stated after just two weeks at the laboratory: “This experience is changing my life.”

A policy for arts

If this arts experiment sounds easy, it isn’t. As with any experiment, it needs expertise and knowledge to make it happen and to build it, starting from solid foundations and structure. So I created for CERN its first arts policy, “Great Arts for Great Science”, putting the arts on the same level of selected excellence as the science to create truly meaningful, high-impact, high-quality engagement, mutual understanding and respect between the arts and science. The first CERN Cultural Board was appointed at this high level of knowledge and excellence – to build expertise in the arts into CERN. The board members, honorary appointments for three years, are recognized leaders in their fields. They include the director-general of the Lyon Opera House, Serge Dorny, and the director of Zurich’s Kunsthalle, Beatrix Ruf, who is acknowledged as one of the most influential figures in contemporary art today. All of the board members donate their time and, crucially, the board also includes a CERN physicist, Michael Doser. Researchers from CERN are also on the juries for all of the artists’ residency awards.

Every year, the board will select at least one major arts project in which CERN officially collaborates, its stamp of approval enabling the project to find external funding. In 2012–2013, the selected project is the cutting-edge, multimedia/dance/opera/film Symmetry, by a truly international team of artists performing across several art forms, including the soprano Claron McFadden and the Nederlands Dans Theater dancer, Lukáš Timulak. The project is the brainchild of the emerging film director, Ruben Van Leer.

So, that is step one of building a p(art)icle collider – create the policy and the structure. The other steps were to: create the flagship Collide@CERN residency scheme; launch a website to make artists’ work, visits and potential involvement with CERN – past, present and future – visible and accessible; and finally give back to the CERN community by advising on home-grown initiatives that have international artistic potential. In 2010, one of my first acts was to carry out a major strategic review of the home-grown, biannual film festival CinéGlobe, created by CERN’s Open Your Eyes film club. The review recommended developing the brand, mission, vision and values, as well as substantial organizational restructuring and planning. I also suggested the slogan “Inspired by Science” to sum up the festival’s mission.

Two years after being hired by CERN, I am still there. It is the positive spirit of fundamental research – the quest to expand human knowledge and understanding for the good of all, engaging with cutting-edge ideas and technologies – that inspires me to work at CERN, as well as being the source of inspiration for artists. After all, landmark moments of science in the 20th century created some of the most significant arts movements of the modern world. My personal belief is that particle physics combines the twin souls of the artist – the theorist, who thinks beyond existing paradigms, and the experimentalist, who tests new ideas and brings them down to Earth. By building a p(art)icle collider, creative collisions between arts and science have truly begun at CERN.

• For more information see: Arts at CERN website, www.cern.ch/arts; and the first Prix Ars Electronica Collide@CERN lecture by Julius von Bismarck, http://cdsweb.cern.ch/record/1433816?ln=en.

The openlab adventure continues to thrive

Friday, 31 May 2001, 6 p.m. – Back in my office, I open my notebook and write “My understanding of MD’s ideas” in blue ink. I draw a box and write the words “Open Lab” in the middle of it. I’ve just left the office of Manuel Delfino, the head of CERN’s IT division. His assistant had called to ask me to go and see Manuel at 4 p.m. to talk about “industrial relations”. I’ve been technology-transfer co-ordinator for a few weeks but I had no idea of what he was going to say to me. An hour later, I need to collect my thoughts. Manuel has just set out one of the most amazing plans I’ve ever seen. There’s nothing like it, no model to go on, and yet the ideas are simple and the vision is clear. He’s asked me to take care of it. The CERN openlab adventure is about to begin.

This is how the openlab story could open if it were ever written as a novel. At the start of the millennium, the case was clear for Manuel Delfino: CERN was in the process of developing the computing infrastructure for the LHC; significant research and development was needed; and advanced solutions and technologies had to be evaluated. His idea was that, although CERN had substantial computing resources and a sound R&D tradition, collaborating with industry would make it possible to do more and do it better.

Four basic principles

CERN was no stranger to collaboration with industry, and I pointed out to Manuel that we had always done field tests on the latest systems in conjunction with their developers. He nodded but stressed that here was the difference: what he was proposing was not a random collection of short-term, independent tests governed by various different agreements. Instead, the four basic principles of openlab would be as follows (I jotted them down carefully because Manuel wasn’t using notes): first, openlab should use a common framework for all partnerships, meaning that the same duration and the same level of contribution should apply to everyone; second, openlab should focus on long-term partnerships of up to three years; third, openlab should target the major market players, with the minimum contribution threshold set at a significant level; last, in return CERN would contribute its expertise, evaluation capacity and its unique requirements. Industrial partners would contribute in kind – in the form of equipment and support – and in cash by funding young people working on joint projects. Ten years on, openlab is still governed by these same four principles.

Back to May 2001. After paving the way with extensive political discussions over several months, Manuel had written a formal letter to five large companies, Enterasys, IBM, Intel, Oracle and KPN QWest, inviting them to become the founding members of the Open Lab (renamed “openlab” a few months later). These letters, which were adapted to suit each case, are model sales-pitches worthy of a professional fundraiser. They set out the unprecedented computing challenges associated with the LHC, the unique opportunities of a partnership with CERN in the LHC framework, the potential benefits for each party and proposed clear areas of technical collaboration for each partner. The letters also demanded a rapid response, indicating that replies needed to reach CERN’s director-general just six weeks later, by 15 June. A model application letter was also provided. With the director-general’s approval, Manuel wrote directly to the top management of the companies concerned, i.e. their chairs and vice-chairs. The letters had the desired effect: three companies gave a positive response by the 15 June deadline, while the other two followed suit a few months later – openlab was ready to go.

The first task was to define the common framework. CERN’s legal service was brought in and the guiding principles of openlab, drawn up in the form of a public document and not as a contract, were ready by the end of 2001. The document was designed to serve as the basis for the detailed agreements with individual partners, which now had to be concluded.

Three-year phases

At the start of 2002, after a few months of existence, openlab had three partners: Enterasys, Intel and KPN QWest (which later withdrew when it became a casualty of the bursting of the telecoms and dotcom bubbles). On 11 March, the first meeting of the board of sponsors was held at CERN. The meeting, chaired by the then director-general, Luciano Maiani, was attended by representatives of the industrial companies as well as by Manuel, Les Robertson (the head of the LHC Computing Grid project) and me. At the meeting I presented the first openlab annual report, which has since been followed by nine more, each printed in a run of more than 1000 copies. Then, in July, openlab was joined by HP, followed by IBM in March 2003 and by Oracle in October 2003.

In the meantime, a steering structure for openlab was set up at CERN in early 2003, led in an ex officio capacity by the new head of the IT Department, Wolfgang von Rüden. Sverre Jarp was the chief technical officer, while François Grey was in charge of communication and I was to co-ordinate the overall management. January 2003 was also a good opportunity to resynchronize the partnerships. The concept of three-year “openlab phases” was adopted, the first covering the years 2003–2005. Management practices and the technical focus would be reviewed and adapted through the successive phases.

Thus, Phase I began with an innovative and ambitious technical objective: each partnership was to form a building block of a common structure, so that all of the projects would be closely linked. This common construction, which we were all building together, was called “opencluster”. Unfortunately, the idea proved too ambitious: the constraints were ultimately too restrictive, both for the existing projects and for bringing in new partners. So what of a new unifying structure to replace opencluster? The idea was eventually abandoned when it came to openlab-II: although the search for synergies between individual projects was by no means excluded, it was no longer an obligation.

A further adjustment occurred in the meantime, in the shape of a new and complementary type of partnership: the status of “contributor” was created in January 2004, aimed at tactical, shorter-term collaborations focusing on a specific technology. Voltaire was the first company to acquire the new status, on 2 April, providing CERN with its first high-speed network based on InfiniBand technology. A further innovation followed in July: François set up the openlab Student Programme, designed to bring students to CERN from around the world to work on openlab projects. With the discontinuation of the opencluster concept, the new contributor status and the student programme, openlab had emphatically demonstrated its ability to adapt and progress. The second phase, openlab-II, began in January 2006, with Intel, Oracle and HP as partners and the security-software companies Stonesoft and F-Secure as contributors. They were joined in March 2007 by EDS, a giant of the IT-services industry, which contributed to the monitoring tools needed for the Grid computing system being developed for the LHC.

The year 2007 also saw a technical development that was to prove crucial for the future of openlab. At the instigation of Jean-Michel Jouanigot of the network group, CERN and HP ProCurve pioneered a new joint-research partnership. So far, projects had essentially focused on the evaluation and integration of technologies proposed by the partners from industry. In this case, CERN and HP ProCurve were to undertake joint design and development work. The hallmark openlab motto, “You make it, we break it”, was joined by a new slogan, “We make it together”. Another major event followed in September 2008, when Wolfgang’s patient, months-long discussions with Siemens culminated in the company becoming an openlab partner. Thus, by the end of Phase II, openlab had entered the world of control systems.

At the start of openlab-III in 2009, Intel, Oracle and HP were joined by Siemens. EDS also decided to extend its partnership by one year. This third phase was characterized by a marked increase in education and communication efforts. More and more workshops were organized on specific themes – particularly in the framework of collaboration with Intel – and the communication structure was reorganized. The post of openlab communications officer, directly attached to the openlab manager, was created in the summer of 2008. A specific programme was drawn up with each partner and tools for monitoring spin-offs were implemented.

Everything was therefore in place for the next phase, which Wolfgang enthusiastically started to prepare at the end of 2010. In May 2011, in agreement with Frédéric Hemmer, who had taken over as head of the IT Department in 2009, he handed over the reins to Bob Jones. The fourth phase of openlab began in January 2012 with not only HP, Intel and Oracle as partners, but also with Chinese multinational Huawei, whose arrival extended openlab’s technical scope to include storage technologies.

After 10 years of existence, the basic principles of openlab still hold true and its long-standing partners are still present. While I, too, passed on the baton at the start of 2012, the openlab adventure is by no means over.

• For a version of this article in French, see https://cern.ch/Fluckiger/Articles/F.Fluckiger-openlab-10_ans_deja.pdf.

It’s good to blog

It all started a year ago over dinner with a good bottle of wine in front of us. Steve Gourlay of Lawrence Berkeley National Laboratory, Stuart Henderson of Fermilab and I talked about the future of accelerator R&D in the US and what could be done to promote it.

We had no idea that an opportunity would present itself so quickly, that it would require such fast action or that blogging would be a central part of carrying out our mission.

A 2009 symposium called “Accelerators for America’s Future” had laid out some of the issues and obstacles, and in September 2011 the US Senate Committee on Appropriations asked the US Department of Energy (DOE) to submit a strategic plan for accelerator R&D by June 2012.

The DOE asked me to lead a task force to develop ideas about this important matter: what should the DOE do, over the next 10 years, to streamline the transfer of accelerator R&D so that its benefits could spread out into the larger society?

We were ready to go by October. The task force would have until 1 February 2012 – just four months – to identify research opportunities targeted to applications, estimate their costs and outline the possible impediments to carrying out such a plan. Based on this information, DOE officials would draw up their strategic plan in time for the congressional deadline.

It was a huge job. The 15 members of the task force, who hailed from six DOE national laboratories, industry, universities, DOE headquarters and the National Science Foundation, would need to gather facts, opinions and ideas from a range of people with a stake in this issue – from basic researchers at the national laboratories to university and industry scientists, entrepreneurs, inventors, regulators, industry leaders, defence agencies and owners of businesses both small and large.

We quickly held a workshop in Washington, DC, followed by others at the Argonne and Lawrence Berkeley National Laboratories, where we presented some of the major ideas. And to gather the most feedback from the most people in the shortest amount of time, I did something that I like to do: I started a blog.

Now, anyone who has been around high-energy physics for a while knows that blogs and other forms of cutting-edge social media are nothing new. We particle physicists, after all, started the World Wide Web as a way to share our ideas, and what became known as the arXiv to distribute preprints of our research results. Many physicists are avid bloggers, and a number of laboratories – from CERN to Fermilab and KEK – operate blogs of their own; you can see a sample of these blogs at www.quantumdiaries.org. But it is less usual to incorporate a blog into the work of a task force – although, for the life of me, I don’t know why you would not want to do it.

One of the first things that I did when I came to SLAC two years ago was to start a blog aimed at fostering communication among people in the Accelerator Directorate. A blog is a great way to talk about topics that are burning under our fingernails – although sometimes one needs to overcome a certain amount of cultural resistance to get people talking freely. Instead of filling various inboxes with chains of e-mails, these “electronic blackboards” are easy to read and easy to post on, and they even have the added convenience of notifying you when a new post goes up.

In the good old days you could have everyone come to one place and have a panel discussion or an all-hands meeting – an easy, free-flowing exchange of ideas. A blog can be just such a thing: open and inviting.

Our task force invited literally thousands of people to comment on the issues at hand. What can be done to move the fruits of basic accelerator research and development more quickly into medicine, energy development, environmental cleanup, industry, defence and national security? What good could flow from such a movement? What are the barriers – especially between the national laboratories, where most of this research is done, and the industries that could develop it into products – and how can they be overcome?

Not everyone answered, but many did. More than half of the responses that we received came in through the blog rather than as e-mail messages. Within a couple of days it became clear, just from the people who blogged, that the medical community is starving for facilities and infrastructure to develop radiation therapy further, mainly with heavy-ion beams. People talked to us and among themselves. So it is no surprise that the report we are writing will describe opportunities for the DOE to make its infrastructure available to researchers who want to pursue this line of work.

Others talked about the difficulties that they had in working with government agencies or national laboratories and how this could be made easier – a worthwhile read during an easy afternoon.

So, blogging is not just fun; it’s a great way to gather information and encourage dialogue. Once our task force finalizes its report, the site will be up for a while, and then, when the next issue arises, the blackboard will get cleaned and I will start a new one.

• To see the blog, go to https://slacportal.slac.stanford.edu/sites/ad_public/committees/Acc_RandD_TF_Blog/default.aspx.

US high-energy physics faces budget cuts

On 13 February, US President Barack Obama unveiled his administration’s budget request for fiscal year 2013, which begins on 1 October 2012. The budget for the Department of Energy’s Office of Science would increase by 2.4 per cent to $4.992 billion, but high-energy physics would be reduced by 1.8 per cent to $777 million. In the next step, the two chambers of the US Congress will take up the negotiations to arrive at a final budget.

The proposed cuts in high-energy physics would hit two long-term programmes the hardest: the Long-Baseline Neutrino Experiment (LBNE) and the US R&D programme for the International Linear Collider (ILC). The budget for LBNE would drop to $10 million from $21 million in the current year. The collaboration had requested an increase to advance its plans to search for CP-violation in neutrino interactions by sending neutrinos from Fermilab to a detector in South Dakota (Steps forward for new long-baseline experiment).

Funding for the US ILC R&D programme is eliminated in the request, a cut of $20 million. While the current ILC R&D phase will end this year, the next phase would have helped to advance accelerator technologies that would benefit projects such as Fermilab’s Project X proton accelerator and Berkeley’s Next-Generation Light Source.

Some programmes would fare much better. Funding for non-accelerator physics programmes would increase by $13 million to ramp up engineering and design efforts for the Large Synoptic Survey Telescope camera project and R&D funding for next-generation dark-matter experiments. The US contribution to the upgrades of the Belle-II detector at KEK in Japan would remain on track, along with near-term neutrino and muon research programmes at Fermilab.

How the CMS collaboration orchestrates its success

New members of the top-level management talk to Antonella Del Rosso about the CMS model for running a large collaboration, as they prepare for the start of the LHC’s run in 2012.

Trying to uncover the deepest mysteries of the universe is no trivial task. Today, the scientific collaborations that accept the challenge are huge, complex organizational structures that have their own constitution, strict budget control and top management. CMS, one of the two general-purpose experiments that study the LHC collisions, provides a good example of how this type of scientific complexity can be dealt with.

 

The collaboration has literally thousands of heroes

Tiziano Camporesi

The CMS collaboration currently has around 4300 members, with more than 1000 new faces joining in the past three years. Together they come from some 170 institutes in 40 countries and six continents. Each institute has specific tasks to complete, which are agreed with the management leading the collaboration. “The collaboration is evolving all of the time. Every year we receive applications from five or so new institutes that wish to participate in the experiment,” says Joe Incandela of the University of California Santa Barbara and CERN, who took over as spokesperson of the CMS collaboration at the start of 2012. “The Collaboration Board has the task of considering those applications and taking a decision after following the procedures described in the CMS constitution. All of the participating institutes are committed to maintaining, operating, upgrading and exploiting the physics of the detector.”

Once they become full members of the collaboration, all institutes are represented on the Collaboration Board – the true governing body of CMS. (In practice, small institutes join together and choose a common representative.) The representatives can also vote for the spokesperson every two years. “To manage such a complex structure that must achieve very ambitious goals, the collaboration has so far always sought a spokesperson from among those people who have contributed to the experiment in some substantial way over the years and who have demonstrated some managerial and leadership qualities,” notes deputy-spokesperson Tiziano Camporesi of CERN. “We often meet film-makers or journalists who tell us that they want to feature a few people. They want to have ‘stars’ who can be the heroes of the show but we always tell them that the collaboration has literally thousands of heroes. I have often heard it said that we are like an orchestra: the conductor is important but the whole thing only works if every single musician plays well.”

Although two years may seem to be a short term, Joao Varela – who is a professor at the Instituto Superior Técnico of the Technical University of Lisbon and also deputy-spokesperson – believes that there are many positive aspects in changing the top management rather frequently. “The ‘two-year scheme’ allows CMS to grant this prestigious role to more people over time,” he says. “In this way, more institutes and cultures can be represented at such a high level. There is a sense of fairness in the honour being shared across the whole community. Moreover, each time a new person comes in, by human nature he/she is motivated to bring in new ideas.”

As good as the idea is to rotate people in the top management, the CMS collaboration is currently analysing the experience already accumulated to see if things can be improved. “So far deputies have always been elected as spokespersons and this has ensured continuity even during the short overlap. I was myself in physics co-ordination, then deputy and finally spokesperson. Even so, I am learning many new things every day,” points out Incandela.

At CMS the spokesperson also nominates his/her deputies and many of the members of the Executive Board, which brings together project managers and activity co-ordinators. “The members of the Executive Board are responsible for most of the day-to-day co-ordination work that is a big part of what makes CMS work so well,” explains Incandela. “Each member is responsible for managing an organization with large numbers of people and a considerable budget in some cases. Historically, the different projects and activities were somewhat isolated from one another, so that members of the board didn’t really have a chance or need to follow what the other areas were doing. With the start of LHC operations in 2008 this began to change and now people focus on broader issues.” To improve communication among the members of the Executive Board, the new CMS management also decided to organize workshops. “These have turned out to be fantastic events,” says Camporesi. “At the meetings, we discuss important and broad issues openly, from what is the best way to do great physics to how to maintain high morale and attract excellent young people to the collaboration.”

To keep the whole collaboration informed about the outcomes of such strategic meetings and other developments in the experiment in general, the CMS management organizes weekly plenary meetings. “I report once a week to the whole collaboration: we typically have anywhere from 50 to 250 people attending, plus 100–200 remote connections. We are a massive organization and the weekly update is a quick and useful means of keeping everybody informed,” adds Incandela.

The scientific achievements of CMS prove not only that a large scientific collaboration is manageable but also that it is effective. In January this year a new two-year term began for the CMS collaboration, which also renewed all of the members of top management. This is a historic moment for the experiment because many potential discoveries are in the pipeline. “This is my third generation of hadron collider – I participated in the UA2 experiment at CERN’s SPS, CDF at Fermilab’s Tevatron and now CMS at the LHC. When you are proposing a new experiment and then building it, the focus is entirely on the detector,” observes Incandela. “Then, when the beam comes, attention moves rapidly to the data and physics. The collaboration is mainly interested in data and the discoveries that we hope to make. We must ensure the high performance of the detector while providing the means for extremely accurate but quick data analysis. However, although almost everything works perfectly, there are already many small things in the detector that need repairing and upgrading.”

It is obviously important if we discover things. But it is also important if we don’t see anything

Joao Varela

The accelerator settings for the LHC’s 2012 run, decided at the Chamonix Workshop in February, will mean that CMS has to operate in conditions that go beyond the design target. “The detector will face tougher pile-up conditions and our teams of experts have been working hard to ensure that all of the subsystems work as expected. It looks like the detector can cope with conditions that are up to 50% higher than the design target”, confirms Camporesi. “Going beyond that could create serious issues for the experiment. We observe that the Level-1 trigger starts to be a limitation and the pixel detector starts to lose data, for instance.” CMS is already planning upgrades to improve granularity and trigger performance to cope with the projected higher luminosity beyond 2014.

Going to higher luminosity may be a big technical challenge but it does mean reducing the time to discoveries. “The final word on the Higgs boson is within reach, now measurable in terms of months rather than years. And for supersymmetry, we are changing the strategy. In 2010–2011, we were essentially searching for supersymmetric partners of light quarks because they were potentially more easily accessible. This approach didn’t yield any fruit but put significant constraints on popular models. A lot of people were discouraged,” explains Varela. “However, what we have not ruled out are possible relatively light supersymmetric partners of the third-generation quarks. The third generation is a tougher thing to look for because the signal is smaller and the backgrounds can be higher. By increasing the energy of the collisions to 4 TeV one gains 50–70% in pair production of supersymmetric top, for instance, while the top-pair background rises by a smaller margin. Having said this, and given the unexplored environment, it is obviously important if we discover things. But it is also important if we don’t see anything.”

There is a long road ahead because the searches will continue at higher LHC energies and luminosities after 2014, but the CMS collaboration plans to be well prepared.

Authors and supporters

The first “high-energy” accelerators were constructed more than 80 years ago. No doubt they represented technological challenges and major achievements even though, seen from a 2012 perspective, the projects involved only a few people and small hardware set-ups. For many of us, making a breakthrough with just a few colleagues and some new equipment feels like a dream from a different era. Nowadays, frontier research in particle physics requires huge infrastructures that thrill the imagination of the general public. While people often grasp only a fraction of the physics at stake, they easily recognize the full extent of the human undertaking. Particle-physics experiments and accelerators are, indeed, miracles of technology and major examples of worldwide co-operation and on-site teamwork.

Looking ahead

Studies on future accelerators and particle-physics experiments at the energy or luminosity frontier now span several decades and involve hundreds, if not thousands, of participants. This means that, while progress is made with the technical developments for a future facility, the physics landscape continues to evolve. The key example of this is the way that current knowledge is evolving quickly thanks to measurements at the LHC. As a result, it is impossible to predict decades in advance what the best machine option will be to expand our knowledge. Pursuing several options and starting long-term R&D well in advance is therefore essential for particle physics because it allows the community to be prepared for the future and to make informed decisions when the right moments arise.

For the post-LHC era, several high-energy accelerator options are already under study. Beyond high-luminosity extensions of the LHC programme, new possibilities include a higher-energy proton collider in the LHC tunnel; various electron–positron colliders, such as the International Linear Collider (ILC) and the Compact Linear Collider (CLIC); and a muon collider. There is typically much cross-fertilization and collaboration between these projects, and there is no easy answer when it comes to identifying who has contributed to a particular project.

When, some months ago, we were discussing the authoring of the CLIC conceptual design report, we faced exactly such a dilemma. The work on the CLIC concept has been ongoing for more than two decades – clearly with a continuously evolving team. On the other hand, the design of an experiment for CLIC has drawn heavily on studies carried out for experiments at the ILC, which in turn have used results from earlier studies of electron–positron colliders. Moreover, we also wanted both the accelerator studies and the physics and detector studies to be authored by the same list.

We looked at how others had dealt with this dilemma and found that in some cases, such as in the early studies for LHC experiments, protocollaborations were taken as a basis for authoring, while others, such as the TESLA and Super-B projects, have invited anyone who supports the study to sign. For the CLIC conceptual design report we opted for a list of “signatories”. Those who have contributed to the development are invited to sign alongside those wishing to express support for the study and the continuation of the R&D. Here non-exclusive support is meant: signing-up for CLIC is not in contradiction with supporting other major collider options under development.

The advantage of the signatories list is that it provides the opportunity to cover a broader range of personal involvements and avoids excluding anyone who feels associated or has been associated with the study. The drawback of our approach is that the signatories list does not pay tribute in a clear way to individual contributions to the study. This recognition has to come from authoring specialized notes and publications that form the basis of what is written in the report.

The signatories list covers both the CLIC accelerator and the report for the physics and detector conceptual design. Already exceeding 1300 names in February, it demonstrates that – even if all eyes are on LHC results – simultaneous R&D for the future is considered important.

Are there better ways of doing this? As the projects develop, the teams are becoming more structured and this helps – at least partly – towards creating appropriate author lists. The size of the teams and the particular timescale of the projects will, however, remain much larger than those of the first accelerator projects in our field, and it is likely that striking the right balance between openness and inclusiveness on the one hand, and restrictions and procedures on the other, will continue to be a difficult subject.

Quantum Engineering: Theory and Design of Quantum Coherent Structures

By A M Zagoskin
Cambridge University Press
Hardback: £45 $80
E-book: $64

Quantum engineering has emerged as a field with important potential applications. This book provides a self-contained presentation of the theoretical methods and experimental results in quantum engineering. It covers topics such as the quantum theory of electric circuits, the quantum theory of noise and the physics of weak superconductivity. The theory is complemented by up-to-date experimental data to help put it into context.

Relativistic Quantum Physics: From Advanced Quantum Mechanics to Introductory Quantum Field Theory

By Tommy Ohlsson
Cambridge University Press
Hardback: £38 $65
E-book: $52

Quantum physics and special relativity were two of the greatest breakthroughs of 20th-century physics, each contributing to a paradigm shift in the field. This book combines these two discoveries to provide a complete description of the fundamentals of relativistic quantum physics, guiding the reader from relativistic quantum mechanics to basic quantum field theory. It gives a detailed treatment of the subject, beginning with the classification of particles, the Klein–Gordon equation and the Dirac equation. Exercises and problems are featured at the end of most chapters.
