
CERN Council opens the door to greater integration

At its 155th session, on 18 June, the CERN Council opened the door to greater integration in particle physics when it unanimously adopted the recommendations of a working group that was set up in 2008 to examine the role of the organization in the light of increasing globalization in particle physics.

“This is a milestone in CERN’s history and a giant leap for particle physics,” said Michel Spiro, president of the CERN Council. “It recognizes the increasing globalization of the field, and the important role played by CERN on the world stage.”

The key points agreed at the meeting were:

• All states shall be eligible for membership, irrespective of their geographical location;
• A new associate membership status is to be introduced to allow non-member states to establish or strengthen their institutional links with the organization;
• Associate membership shall also serve as the obligatory pre-stage to full membership;
• The existing observer status will be phased out for states, but retained for international organizations.

International co-operation agreements and protocols will be retained. “Particle physics is becoming increasingly integrated at the global level,” explained CERN’s director-general Rolf Heuer. “The decision contributes towards creating the conditions that will enable CERN to play a full role in any future facility, wherever in the world it might be.”

CERN currently has 20 member states: Austria, Belgium, Bulgaria, the Czech Republic, Denmark, Finland, France, Germany, Greece, Hungary, Italy, Netherlands, Norway, Poland, Portugal, Slovakia, Spain, Sweden, Switzerland and the UK. India, Israel, Japan, the Russian Federation, the US, Turkey, the European Commission and UNESCO have observer status. Applications for membership from Cyprus, Israel, Serbia, Slovenia and Turkey have already been received by the CERN Council, and are currently undergoing technical verification. At future meetings, Council will determine how to apply the new arrangements to these states.

In other business, Council recognized that further work is necessary on the organization’s medium-term plan, in order to maintain a vibrant research programme through a period of financial austerity, and endorsed CERN’s new code of conduct.

Full details of the new membership arrangements can be found in Council document CERN/2918, which is available at http://indico.cern.ch/getFile.py/access?resId=1&materialId=0&contribId=35&sessionId=0&subContId=0&confId=96020.

Lake Views: This World and the Universe

by Steven Weinberg, Belknap Press/Harvard University Press. Hardback ISBN 9780674035157, $25.95.


This book collects essays and book reviews written by Steven Weinberg between 2000 and 2008. They were written in his study at home, from where the author can see Lake Austin. In 25 chapters he covers an impressive range of subjects, from military history to his review of Richard Dawkins’ book The God Delusion, passing through fundamental physics, missile defence and the boycott of Israeli academics, and even offering some advice to young students and postdoctoral fellows.

As with his previous books, one is captivated by the depth and breadth of his knowledge, the elegance of his prose and his intellectual honesty. Each chapter has a preamble in which he explains the origin of the article, whether it was commissioned by a journal or delivered as an address to a learned society; an afterword reveals some of the reactions that his views have elicited.

An important part of the book is dedicated to the current theory of multiverses and string landscapes. To a certain extent all of these developments were inspired by his remarkable work in the late 1980s (explained in the book) where he used anthropic reasoning to understand (if not explain) the possible value of the cosmological constant, also known as the dark energy of the universe. It is quite remarkable that the values derived from the observations carried out by groups studying galactic redshifts, as well as from the Wilkinson Microwave Anisotropy Probe satellite, are in good agreement with those favoured by his analysis. The sections of the book describing this work, dealing with Einstein’s famous blunder, are a masterpiece of insight and deep mastery of physics.

In other chapters, covering the humanities or religion, he takes his usual “rationalist, reductionist, realist and devoutly secular” viewpoint. Unlike Dawkins, his discourse is not that of a “born-again atheist” (my quotes); rather, he explains his point of view in a relaxed manner not devoid of humour. The effect of the relevant chapters is probably much stronger in US society, where religion plays a much bigger role than in Europe, where a large number of scientists, humanists, politicians and ordinary citizens would easily agree with his discourse. He raises provocation to the level of an art.

Another theme addressed in these essays is the ongoing discussion with philosophers and theologians on whether science explains only the “how” and not the “why” of things. He makes it very clear that the laws of nature have no purpose, and that the only legitimate purpose of science is to understand the basic laws that rule the universe. Finality is not the aim of science, but that does not make it a lesser element in the human endeavour to understand the universe that we live in.

Weinberg has not lost his punch. Far from it. This book is thought-provoking, informative, challenging and fun to read. A single fault: it is too short.

The Edge of Physics: A Journey to Earth’s Extremes to Unlock the Secrets of the Universe

by Anil Ananthaswamy, Houghton Mifflin Harcourt. Hardback ISBN 9780547394527, $25. Paperback ISBN 9780547394527, $15.95.


In his recent book The Edge of Physics, Anil Ananthaswamy, a science writer for New Scientist, covers the most extreme physics and astronomy experiments that are set to uncover the secrets of neutrinos, dark matter and dark energy, galaxy formation, supersymmetry and extra dimensions. The author takes us on an extraordinary journey over five continents to tour the best telescopes and particle detectors, from the summits of the Andes to deep down in the Soudan mine, stopping by the South Pole and paying a visit to CERN. Following him on this trip is already exciting, but reading his account of discussions with physicists, astronomers and engineers along the way is simply fascinating. He tells us about each experiment as he discovered it, through discussions with the scientists involved. For example, he writes about the ATLAS experiment through the eyes of Peter Jenni, Fabiola Gianotti and François Butin, with added insight from a meeting with Peter Higgs.

This makes for lively reading about all of these experiments. He not only tells us the most striking details about how each one was built, but he also includes accurate information about the science and technology behind them, avoiding clichés in his efforts to make it understandable to all. You read about his stay at Lake Baikal, discussing neutrino physics with enthusiastic and dedicated physicists such as Igor Belolaptikov and Ralf Wischnewski, sharing stories and vodka with them on the shores of the lake in the midst of winter – the only time that the photomultiplier tubes of the underwater neutrino experiment can be serviced, from the frozen surface. The reader learns about the scientific research at all of these places through personal accounts from the scientists involved. At times, it felt as if I was meeting old friends at a conference and hearing their best stories about their experiments, sharing their enthusiasm and discovering unknown details about their research.

The Edge of Physics also allowed me to learn more about the best astronomy instruments, some located in idyllic places such as Hawaii, while others are under construction in the least life-sustaining places, such as in the Karoo desert in South Africa or the Hanle Valley in India. Ananthaswamy’s book is as much a tribute to the science as it is to the dedicated scientists pushing the limits of knowledge. His clear explanations and entertaining style will appeal to scientists and non-scientists alike. A book not to be missed.

EUAsiaGrid discovers opportunity in diversity

More than half of the world’s people live in Asia. Even putting aside the two titans India and China, there are some 600 million inhabitants – 100 million more than in the entire EU – in the region that is commonly referred to as South-East Asia. From Myanmar in the west to Indonesia’s Papua province in the east, the territory is nearly twice the width of the continental US. Most of the Asian partners in EUAsiaGrid hail from this region, which has more than its fair share of natural disasters in the form of earthquakes, volcano eruptions, typhoons and tsunamis, not to mention enduring political tensions.

Despite these challenging circumstances, EUAsiaGrid has managed to make a significant impact in a relatively short time. This has been driven by increased sharing of data storage and processing power between participating institutions in the region. It was achieved through a concerted effort by the project leaders to encourage the adoption across the region of the gLite middleware of Enabling Grids for E-sciencE (EGEE), which is the same middleware used by the Worldwide LHC Computing Grid (WLCG).

As the head of EUAsiaGrid, Marco Paganoni, who is based at INFN and the University of Milan-Bicocca, points out: “This technological push has enabled researchers in some of the participating countries to become involved in international science initiatives that they otherwise might not be able to afford to participate in.”

EUAsiaGrid owes its origins to the pioneering efforts of the global high-energy physics community

Like many other international Grid projects, EUAsiaGrid owes its origins to the pioneering efforts of the global high-energy physics community to promote Grid technology for science, and to the nurturing role of the European Commission in spreading Grid technical know-how throughout the world through joint projects. In addition, a key catalyst for EUAsiaGrid has been Simon Lin, project director of Academia Sinica Grid Computing (ASGC). His efforts established ASGC as the Asian Tier-1 data centre for WLCG. He and his team have been bringing Asian researchers together for nine years at the annual International Symposium on Grid Computing (ISGC) held each spring in Taipei.

The EUAsiaGrid project, launched as a “support action” by the European Commission within Framework Programme 7 in April 2008, focuses on discovering regional research benefits for Grid computing. “We realized that identifying and addressing local needs was the key to success in this region,” says Paganoni. From the outset, capturing local e-science requirements was an important component of the project’s objectives. Moreover, comparing those requirements revealed a great deal of common ground amid all of the regional diversity.

Earth-shaking experience

One common theme was the region’s propensity for natural disasters and the ability of Grid technology and related information technology solutions to help mitigate the consequences of such events. For example, EUAsiaGrid researchers have helped build links between different national sensor-networks, such as those of Vietnam and Indonesia. Researchers in the Philippines are now benefiting from the Grid-based seismic modelling experience of their Taiwanese partners. Sharing data and Grid know-how in this manner means that the scientists involved can better tune local models of earthquake and tsunami propagation.

At the most recent ISGC, which was held in March, a special EUAsiaGrid Disaster Mitigation Workshop devoted a day to the latest technological progress in monitoring and simulating both earthquakes and tsunamis. Nai-Chi Hsiao of the Central Weather Bureau in Taipei explained in a talk about the early-warning system for Taiwan that it takes just 60 s for an earthquake to travel from the south to the north of the island, leaving precious little time to make a decision about shutting down nuclear reactors or bringing high-speed trains to a grinding halt and so avoid the worst consequences that a large earthquake might cause.

Where could Grid technology fit into this picture? The island is rocked by earthquakes, both large and small, all of the time. It is simply not viable to shut down power plants and stop trains every time that a tremor is detected. What is needed is a quick prediction of the impact that a particular earthquake may have on key infrastructure across the island. However, the level of shaking that an earthquake produces 100 km away can depend strongly on, for example, the depth at which it occurs.

There is certainly no time to do a full simulation once an earthquake is detected. According to Li Zhao of the Institute of Earth Sciences at Academia Sinica, it might instead be possible to pull out a pre-processed simulation from a database and make a quick decision based on what it predicts. This would require processing and storing the results of simulations for a huge number of possible earthquake epicentres – a task that is well suited to Grid computing.
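Zhao's lookup idea can be sketched in a few lines of Python. Everything below – the grid spacing, the site names and the shaking values – is purely illustrative and not taken from the project; the point is only to show how an expensive offline simulation campaign turns the real-time problem into a fast table lookup:

```python
# Illustrative sketch of a precomputed earthquake-scenario lookup.
# Simulations for a grid of possible epicentres and depths are run
# offline (a task well suited to Grid computing) and stored; when an
# earthquake is detected, the nearest stored scenario is retrieved
# instead of running a new simulation.

def scenario_key(lat, lon, depth_km, step=0.1, depth_step=5):
    """Snap an epicentre to the nearest precomputed grid point."""
    return (round(lat / step) * step,
            round(lon / step) * step,
            round(depth_km / depth_step) * depth_step)

# Offline phase: store predicted peak ground shaking (arbitrary units)
# per scenario, per site of interest. Values here are made up.
database = {
    scenario_key(23.5, 121.0, 10): {"Taipei": 0.12, "Kaohsiung": 0.30},
    scenario_key(24.0, 121.5, 20): {"Taipei": 0.25, "Kaohsiung": 0.08},
}

def quick_decision(lat, lon, depth_km, threshold=0.2):
    """Look up the nearest stored scenario; flag sites above threshold."""
    shaking = database.get(scenario_key(lat, lon, depth_km), {})
    return [site for site, pga in shaking.items() if pga >= threshold]

print(quick_decision(24.02, 121.48, 19))  # prints ['Taipei']
```

The decision itself costs one dictionary lookup, leaving almost the entire 60-second budget for acting on it.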

Neglected diseases

Another common thread of the research sponsored by EUAsiaGrid has been searching for cures to diseases that plague the region but which have been largely neglected by pharmaceuticals companies because they do not affect more lucrative markets in the industrialized world.

Consider dengue fever, for example. For most sufferers, the fever and pain produced by the disease pass after a very unpleasant week, but for some it leads to dengue haemorrhagic fever, which is often fatal. Like malaria, dengue is borne by mosquitoes. But unlike malaria, it affects people as much in the cities as it does in the countryside. As a result, it has a particularly high incidence in heavily populated parts of South-East Asia and it is a significant source of infant mortality in several countries.

As yet there are no drugs designed to specifically target the dengue virus. So EUAsiaGrid partners launched an initiative last July called Dengue Fever Drug Discovery, which will start a systematic search for such drugs by harnessing Grid computing to model how huge databases of chemical compounds would interact with key sites on the dengue virus, potentially disabling it.

This is not the first time that Grid technology has been used to amplify the computing power that can be harnessed for such ambitious challenges. Malaria and avian influenza have been targets of previous massive search efforts, dubbed by experts “in-silico high-throughput screening”.

Leading the effort on dengue at Academia Sinica in Taipei is researcher Ying-Ta Wu of the Genomics Research Centre. He and colleagues prepared some 300,000 virtual compounds to be tested in a couple of months, using the equivalent of more than 12 years of the processing power of a single PC. The goal of this exercise was not just to get the processing done quickly but also to encourage partners in Asia to collaborate on sharing the necessary hardware, including institutes in Malaysia, Vietnam and Thailand.
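Taking the quoted figures at face value, a back-of-envelope check shows the scale of the campaign and the degree of parallelism the Grid had to supply:

```python
# Rough sanity check on the dengue docking campaign described above:
# ~300,000 compounds screened in about two months, using the
# equivalent of more than 12 years of a single PC's processing power.

compounds = 300_000
cpu_years = 12            # quoted single-PC equivalent (lower bound)
wall_months = 2           # elapsed time on the Grid

cpu_hours = cpu_years * 365 * 24
hours_per_compound = cpu_hours / compounds
speedup = cpu_years * 12 / wall_months   # effective PC-equivalents

print(f"~{hours_per_compound:.2f} CPU-hours per compound docking")
print(f"~{speedup:.0f}x parallelism (PC-equivalents running flat out)")
```

In other words, each docking is modest on its own (about 20 CPU-minutes), but only a shared pool equivalent to some 70 machines running continuously could finish the full library in two months – exactly the kind of embarrassingly parallel workload the Grid handles well.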

It is not just hard sciences such as geology and biology that benefit from Grid know-how. Indeed, as Paganoni notes: “Modelling the social and economic impacts of major disasters and diseases is a Grid-computing challenge in itself, and is often top of the agenda when EUAsiaGrid researchers have discussions with government representatives in the region.”

Even the humanities have benefited from these efforts. Capturing culture in a digital form can lead to impressive demands for storage and processing. Grid technology has a role to play in providing those resources. For instance, it can take more than a week using a single desktop computer to render a 10-minute recording of the movements of a Malay dancer performing the classical Mak Yong dance into a virtual 3D image of the dancer, using motion-capture equipment attached to the dancer’s body. Once this is done, though, every detail of the dance movement is permanently digitized, and hence preserved for posterity, as well as being available for “edutainment” applications.

The problem, however, is that a complete Mak Yong dance carried out for ceremonial purposes could last a whole night, not just 10 minutes. Rendering and storing all of the data necessary for this calls for Grid computing.
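Scaling the quoted rendering figure makes the point concrete; the eight-hour performance length below is an assumption for illustration only:

```python
# Scaling the quoted rendering cost (one desktop: more than a week
# per 10 minutes of motion capture) to a full ceremonial performance.
# The eight-hour length is an assumed figure, not from the article.

minutes_per_performance = 8 * 60          # assumed all-night ceremony
weeks_per_10_min = 1                      # quoted lower bound
segments = minutes_per_performance / 10
desktop_weeks = segments * weeks_per_10_min

print(f"{segments:.0f} ten-minute segments -> more than "
      f"{desktop_weeks:.0f} weeks (~{desktop_weeks / 52:.1f} years) "
      f"on a single desktop")
```

Nearly a year of single-machine rendering for one night's dance is clearly impractical, which is why distributing the segments across Grid resources becomes attractive.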

Faridah Noor, an associate professor at the University of Malaya, became involved in the EUAsiaGrid project because she saw great potential for Grid-enabled digital preservation of traditional dances and artefacts for posterity. She and her colleagues are working on several projects to capture and preserve digitally even the most ephemeral cultural relics, such as masks carved by shamans of the Mah Meri tribe used to help cure people of their ailments or to ward off evil. The particular challenge here is that the shamans deliberately throw the masks into the sea as part of the ritual, to cast away bad spirits.

As Noor, who works in the area of sociolinguistics and ethnolinguistics, points out: “We have to capture the story behind the mask.” Each mask is made for an individual and his or her illness, so capturing the inspiration that guides the shaman while preparing the mask is as important as recording the way in which he carves the wood, and rendering 3D images of the resulting mask.

An important legacy of the EUAsiaGrid project, Paganoni says, will be the links that it has helped to establish between researchers in the natural sciences, the social sciences and the humanities, both within South-East Asia and with European institutions. These links trace their origin to a common interest in exploiting Grid technology.

 

• Based on articles previously published in International Science Grid This Week, with permission.

Collide – a cultural revolution

Would you employ me to run the LHC? Or perhaps to run an experiment at CERN with antimatter? After all, I have an abiding interest in physics – ever since an inspiring science teacher sparked my imagination with the Van de Graaff generator and the laws of gravity. I have no expertise and little experience in physics – just a schoolgirl’s love of equations combined with joyful enthusiasm and a wish to understand and engage with what it is that makes the world work.


Now turn this question round: would you ask a physicist to devise an arts programme or CERN’s first cultural policy for engaging with the arts? What would your answer be? All right, I admit it. This is deliberate provocation. So let me explain.

Much has been written about the two cultures – art and science. It is a false distinction, which was imposed in the Age of Enlightenment and which in the 21st century we are finally beginning to shake off. Leonardo da Vinci made no such distinction between art and science. Aristotle most definitely did not. As the physicist-turned-poet Mario Petrucci says: “I have found that the rigour and precision of the scientist is not foreign to the poet, just as the faith-leaps of poetry are not excluded from the drawing boards of science.” The arts and science are kissing cousins. Their practitioners love knowledge and discovering how and why we exist in the world. They just express it in different ways.

Where a distinction between art and science does exist – and it has contributed to this misunderstanding of how intimately related they really are – is in the ways in which people’s work is judged and evaluated. Cultural knowledge and expertise in the arts can seem totally mystifying. Why is one artist judged as great, and another not? There are no equations to evaluate and therefore no absolutes. The arts can seem a muddy water of individual will, taste and whimsical patronage. But this is a simplistic distortion of a more complex and nuanced picture.

Arts specialism is all about knowledge and understanding. It is about knowing inside-out the history of art forms – whether dance, music, literature, the visual arts or film – and possessing the expertise to evaluate contemporary work; to spot the innovative and the boundary-bursting, as well as the great and exceptional. History lies at its heart – arts knowledge exists on a space–time continuum of reflection and understanding of the creative processes. Moreover, at the heart of this is what every scientist understands: peer review. Experts who are used to working with artists, who know what they are realistically capable of, as well as understanding the past and therefore the present and the future, choose and select projects and individuals for everything from exhibitions and showings, to competitions and grants, for example.

Which takes us to a bold and brave new experiment at CERN and my presence there. Don’t worry. I am not tinkering with the LHC. The collisions and interactions that I will be working with are all of the cultural kind. My expertise, knowledge and experience are in the field of arts and culture – 25 years of working in that arena, working with science too. The director-general, Rolf Heuer, has the vision and the wish to express the crucial inter-relationship of arts and science that makes culture. To do this, I am raising the partnerships and funds for “Collide” – an international arts residency programme at CERN – in which artists will come every year from different art forms to engage with scientists in a mutual exchange of knowledge and understanding through workshops, lectures and informal talks, and to begin to make new work. Who knows what the artists will create? Or the scientists for that matter? A spark chamber of poetry? A dance that defies gravity? Light sculptures that tunnel into the sky? Who knows? That depends on the serendipity of who applies and how they interact with whom and what is at CERN.

Crucially, the artists in residence will be selected by a panel of leading scientists working alongside leading arts specialists – directors, producers, curators, artists – so that mutual understanding and appreciation of how cultural knowledge works, and how expert judgements are made, can develop and be exchanged. This is one of the key strategies of a new cultural policy for engaging with the arts that I have devised for CERN. After all, great science deserves great art. Nothing less will do for the place that pushes knowledge to beyond known limits.

Nevertheless, at the heart of the arts at CERN is the critical connection between the lateral and logical minds that artists and scientists both have. “Collide” will be a way of showing this, of encouraging scientists and artists to work together in a structured programme of interplay and exchange. It will also be a way of creating an all-encompassing vision of CERN to the outside world and on different platforms – from stage and screen to canvas and the orchestra – showing CERN’s status as a major force in culture, as befits the home of the LHC and of what some consider to be the biggest, most significant experiment on Earth.

The Standard Model and Beyond

by Paul Langacker, CRC Press. Hardback ISBN 9781420079067, £49.99 ($79.95).


The Standard Model of elementary particles and their interactions via the electromagnetic, weak and strong interactions is a fabulously successful theory. Tests of quantum electrodynamics have been made to a precision of better than one part in a billion; electroweak tests approach the level of one part in a hundred thousand; and even tests of quantum chromodynamics, which are intrinsically more challenging, are being made at the per-cent level.

Yet, despite this, we are still sure that the Standard Model cannot be the “ultimate” theory. We have yet to account theoretically for the exciting observations of recent decades, namely massive neutrinos, dark matter and dark energy, which provide direct evidence for new physics processes. We cannot account for the observed patterns of the masses of the fermion building blocks of matter, their manifestation in three generations or “families”, the “mixing” between the generations, or why the universe seems to contain almost no antimatter. And we don’t yet understand how to incorporate gravity in terms of a quantum-field theory.

Theoreticians have not been idle in developing models of the new physics that could underlie the Standard Model and that ought to manifest itself at the tera-electron-volt energy scale, such as alternative spontaneous electroweak symmetry-breaking mechanisms, supersymmetry and string theories. However, within the framework of the Standard Model itself, we have yet to observe the Higgs boson, the presence of which is required to account for the generation of the masses of the W and Z particles.

This substantial book – at more than 600 pages – gives a detailed and lucid summary of the theoretical foundations of the Standard Model, and possible extensions beyond it.

Chapter 1 sets up the required notations and conventions needed for the ensuing theoretical survey. Chapter 2 reviews the basics of perturbative field theory and leads, via an introduction to discrete symmetry principles, to quantum electrodynamics. Group theory, global symmetries and symmetry breaking are reviewed in Chapter 3, which forms the foundation for the presentation of local symmetries and gauge theories in Chapter 4, where the Higgs mechanism is first introduced.

The heart of the book lies in Chapters 5 (strong interactions), 6 (weak interactions) and 7 (the electroweak theory), which at more than 170 pages is the most substantial. These chapters present a clear theoretical discussion of key physical processes, along with the phenomenology required for a comparison with data, and a brief summary of the relevant experimental results. Precision tests of the Standard Model are summarized, and the framework is introduced for parametrizing the head-room for new physics effects that go beyond it.

The final chapter summarizes the known deficiencies of the Standard Model and introduces the well-developed extensions: supersymmetry, extended gauge groups and grand unified theories. Fortunately, now that the LHC is up and running, we should expect to start to address experimentally at least some of these theoretical speculations. LHC results will provide the sieve for filtering the profound and accurate, versus the merely beautiful and mathematically seductive, models of nature.

The book ranges over huge swathes of theoretical territory and is self-consciously broad, rather than deep, in terms of coverage. I heartily recommend it to particle physicists as a great single-volume reference, especially useful to experimentalists. It also provides a firm, graduate-level foundation for theoretical physicists who plan to pursue concepts beyond the Standard Model to a greater depth.

 

CERN@school brings real research to life


School is where students study what is in textbooks and university is where they start doing research. Or so most people think. It therefore comes as a surprise to discover that teenagers still at school can participate in a research programme that combines space science and Earth science. While sceptical educators would argue that in a normal situation teachers have no time, energy, motivation or money for such projects, Becky Parker at Simon Langton Grammar School in the UK has proved that the opposite can be true.

Inspired during a visit to CERN in 2007, she decided to bring leading-edge research to her school. Instead of going back with a simple presentation about how CERN works, Becky took back a real detector and started sowing ideas about how to set up a real research programme, which she called CERN@school. Her ideas fell on fertile ground as her school in Canterbury, in the county of Kent, is one of the most active in implementing innovative ways of teaching science in the UK. One of the school’s declared goals is to “provide learning experiences which are enjoyable, stimulating and challenging and which encourage critical and innovative thinking”. Students at Simon Langton Grammar School do not just study science, they do it.

“During one of my visits to CERN, I had the opportunity to meet Michael Campbell of the Medipix collaboration, and his young enthusiastic team,” recalls Becky. “They showed me the Timepix chip that they were developing for particle and medical physics. I thought that something like this could be used in schools for conducting experiments with cosmic rays and radioactivity.”

Cross-collaboration

The Timepix chip is derived from Medipix2, a device developed at CERN that can accurately measure the position and energy of single photons hitting an associated detector. The most recent success of the Medipix collaboration is the Medipix3 chip, which is being used in a project to deliver the first X-ray images with colour (energy) information. Initially designed for use in medical physics and particle physics, the Timepix chip now has applications that include beta- and gamma-radiography of biological samples, materials analysis, monitoring of nuclear power-plant decommissioning and electron microscopy, as well as the adaptive optics that are used in large, ground-based telescopes.

The students at Simon Langton Grammar School use the Timepix chip by connecting it directly to their computer via a USB interface box. “The box was developed by the Institute of Experimental and Applied Physics in Prague,” explains Becky. “They also developed the Pixelman software that we use to read out the data.” The chip and the box have a certain material cost but the software is made available for free by the Medipix collaboration.

Given the simple set-up and its relatively low cost, Becky’s idea can potentially be transferred to many other schools across the UK and elsewhere in Europe. “Collaboration is a key factor in modern research,” confirms Becky. “And, as in a real scientific collaboration, we are going to involve as many schools as possible in our project. We have received funding from Kent to put 10 Timepix chips into the county’s schools to create a network. This will allow us to show students how you do things at CERN and in other big laboratories.”

By setting up a network, schools will collect large amounts of data on cosmic rays. “In the future we hope to have Timepix detectors in schools across the world. Participating schools will be able to send data back to us because we have powerful IT facilities and we can store large quantities of data,” says Becky. “We know that in other countries, such as Canada, Italy and the Netherlands, there are similar school programmes that collect data on cosmic rays. It would be ideal if we could all join our efforts and integrate all of the collected data together.”

Timepix in space

Nothing is out of reach for Becky’s ambitious teaching methods, not even deep space. In 2008 the school’s students decided to enter a national competition run by the British National Space Centre to design experiments that will fly in space. Next year, Surrey Satellite Technology Ltd will fly the Langton Ultimate Cosmic ray Intensity Detector (LUCID), a cosmic-ray detector array designed by Langton’s sixth-form students, on one of its satellites. “The students are learning so much from working on LUCID with David Cooke at Surrey Satellite Technology Limited and Professor Larry Pinsky from the University of Houston,” says Becky.

In LUCID, four Timepix chips are mounted on the sides of a cube (figure 1). Students have demonstrated that the four chips allow for the largest active area without breaking power and data transmission limits. A fifth chip, mounted horizontally on the base of the cube, will be modified to detect neutrons. LUCID’s electronics, including a field-programmable gate array for read-out, will be on printed circuit boards attached to the chips.


The Timepix detectors produced at CERN do not qualify for use in space. “At one of the last stages of the competition, we were told that our project would go through if we could raise the additional £60,000 needed to qualify the Timepix detectors for space,” Becky recalls. Thanks to the support of the South East England Development Agency and Kent County Council, the money was found and LUCID could go into space. LUCID will be mounted outside the spacecraft’s fuselage, housed in a 3 mm (0.81 g cm–2) or 4 mm (1.08 g cm–2) enclosure. Components will mostly be on an inside face of the board, offering a further 0.25 g cm–2 of shielding. The detector will also have to be qualified to withstand a vibration level of 20 g (rms).
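The areal densities quoted for the enclosure follow directly from the wall thickness multiplied by the material density. A short sketch illustrates the arithmetic, assuming (the article does not say so explicitly) that the enclosure is aluminium, whose density of 2.70 g/cm³ reproduces both quoted figures:

```python
# Areal density (mass per unit area) of a shielding wall:
#   sigma = thickness x density
# Assuming an aluminium enclosure (rho = 2.70 g/cm^3), which
# reproduces the 0.81 and 1.08 g/cm^2 figures quoted above.

AL_DENSITY = 2.70  # g/cm^3, aluminium (assumed material)

def areal_density(thickness_mm: float, density: float = AL_DENSITY) -> float:
    """Return shielding areal density in g/cm^2 for a wall of the given thickness."""
    return (thickness_mm / 10.0) * density  # convert mm to cm, then multiply

for t in (3, 4):
    print(f"{t} mm wall: {areal_density(t):.2f} g/cm^2")
# -> 3 mm wall: 0.81 g/cm^2
# -> 4 mm wall: 1.08 g/cm^2
```

The same relation gives a quick sanity check on the extra 0.25 g cm–2 from board-mounted components: it is equivalent to a little under 1 mm of additional aluminium.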

Under Becky’s plans, data from LUCID will be compared with data collected by detectors installed on Earth, thus providing information about cosmic rays. “We expect terabytes of data each year from space. We will receive support from the UK Particle Physics Grid (GridPP) to use the Grid. It is a whole research package!” she says.

The CERN@school project is not the only scientific project that Simon Langton Grammar School students are carrying out. “We collaborate with Imperial College on a research project in plasma physics. One of our students won the ‘Young Scientist of the Year’ prize and published a paper in a proper scientific journal. Others participate in a scientific project for the observation of exoplanets using the Faulkes Telescopes in Hawaii and Australia,” says Becky.

In addition the school hosts special projects in biology and in other branches of science, and also has its own research centre, the Langton Star Centre. This facility, still under construction, will have laboratories and training and seminar rooms. “We will be able to train teachers and students from other schools who want to take part in CERN@school and our other projects,” explains Becky. The centre’s website will include pages where data and analysis results from the network of participating schools will be shared.

These innovative teaching approaches benefit both students and teachers. The school’s philosophy is that 30% of the activities carried out must be beyond the official syllabus. The outcome is that the school provides about 1% of the total number of students studying for physics and engineering degrees at British universities. At the same time, motivating the teachers becomes much easier when they have the prospect of participating in real research programmes in collaboration with CERN, for example.


Many young people at school do not know what it would be like to study physics or engineering at university and do research at the forefront of the field. However, when they get to work with real scientists, they discover how amazing this is and readily jump aboard ambitious programmes. “If teachers let students take control in these kinds of projects, they will not mess around – they are going to do all of this properly because they know that this is serious stuff,” assures Becky. “With my students, I am quite rigorous. I tell them that they are going to do it like real scientists. And because this is really an amazing thing to be involved with, they do it properly and with a lot of enthusiasm.”

Becky’s attitude to “her” students, whom she calls “sweethearts”, is a far cry from that of teachers who say how difficult it is to control behaviour in schools and motivate students every day. So why is Becky’s experience so different? “I am in a lovely school,” she explains. “The cool thing to do at my school is physics. A 12 year old came to me last year and said: ‘Miss, we would like you to teach us quantum physics’ and so I did it.”

Becky’s initiative to foster the knowledge of “cool” physics in the region includes the “Langton Guide to the Universe”, in which parents are invited to attend physics lectures on modern and exciting physics. “Families come and receive a first input on things like quantum mechanics. Some kids who attended those lectures when they were very young later joined the school and set up the ‘quantum working group’, which produced a guide to how to teach quantum mechanics to the youngest. They have entered a national competition and reached the final.” These are the sort of expectations that you can have when you go to Simon Langton Grammar School. As Becky explains: “Our philosophy is that if students are interested in doing a scientific project, however ambitious, they can come and talk to us. This is your world, take the initiative and make it successful!”

Gell-Mann: quantum mechanics to complexity


To celebrate Murray Gell-Mann’s many contributions to physics in his 80th year, the Institute of Advanced Studies at Nanyang Technological University and the Santa Fe Institute jointly organized the Conference in Honour of Murray Gell-Mann, which took place in Singapore on 24–26 February. Aptly entitled “Quantum Mechanics, Elementary Particles, Quantum Cosmology and Complexity” to focus on Gell-Mann’s achievements in these fields, the three-day conference was a festival of lectures and discussions that attracted more than 150 participants from 22 countries. Those in attendance included many of Gell-Mann’s former students and collaborators. For a select few this was their second visit to Singapore, having attended the 25th Rochester Conference held there 20 years ago.

The meeting began with a brief scientific biography of Gell-Mann presented by his close collaborator Harald Fritzsch of Ludwig-Maximilians University, who highlighted his main achievements. During the 1950s Gell-Mann worked with Francis Low on the renormalization group and with Richard Feynman on the V–A theory of the weak interaction. The application of the SU(3) symmetry group to classify hadrons led Gell-Mann to predict the existence of the Ω particle in 1962; its subsequent discovery in 1964 paved the way to his receiving the Nobel Prize in Physics in 1969. Gell-Mann and George Zweig independently proposed quarks as the constituents of hadrons in 1964.

Gell-Mann studied the current algebra of hadrons together with various co-workers. In 1971 he introduced light-cone algebra together with Fritzsch, as well as the colour quantum number for quarks. A year later they proposed the theory of QCD for the strong interaction. In 1978 Gell-Mann, Pierre Ramond and Richard Slansky proposed the seesaw mechanism to explain the tiny neutrino masses. Then, in around 1980, Gell-Mann switched his interest towards the foundations of quantum mechanics, quantum cosmology and string theory.

Multifaceted

Gell-Mann’s interests extend beyond physics – he loves words, history and nature. He has moved between disciplines that include historical linguistics, archaeology, natural history and the psychology of creative thinking, as well as other subjects connected with biological and cultural evolution and with learning. He currently spearheads the Evolution of Human Languages Program at the Santa Fe Institute, which he co-founded.

The subsequent talks by Nicholas Samios of Brookhaven National Laboratory and George Zweig of Massachusetts Institute of Technology (MIT) were very entertaining. They touched on the historical background that led to the discovery of the Ω – predicted by Gell-Mann’s Eightfold Way – and to the quark model of hadrons, and were accompanied by interesting anecdotes and photographs. Zweig related the origin of the terminology “quark” and how the battle between “aces” and quarks unfolded.

There were several talks on recent advances in various theoretical and experimental aspects of QCD as well as on the Higgs boson. CERN’s John Ellis discussed the Higgs particle and prospects for new physics at the LHC. Nobel laureate C N Yang of Tsinghua University gave a talk on his recent work on the ground-state energy of a large one-dimensional spin-1/2 fermion system in a harmonic trap with a repulsive delta-function interaction, based on the Thomas-Fermi method. Gerard ‘t Hooft of Utrecht University – another Nobel laureate – presented a possible mathematical relationship between cellular automata and quantum-field theories. This may provide a new way to interpret the origin of quantum mechanics, and hence a new approach to the gravitational force.


Gell-Mann himself ended the first day’s sessions with interesting personal recollections and reflections on “Some Lessons from 60 Years of Theorizing”. His main observations can be summarized as follows. First, every once in a while, it is necessary to challenge some widely accepted idea, typically a prohibition on thinking in a particular way – a prohibition that turns out to have no real justification but holds up progress in understanding. It is important to identify such roadblocks and get round them. Second, it is sometimes necessary to distinguish ideas that are relevant for today’s problems from ones that pertain to deeper problems of the future. Trying to bring the latter into today’s work can cause difficulties. Finally, doubts, hesitation and messiness seem to be inevitable in the course of theoretical work (and experiments too, sometimes). Perhaps it is best to embrace this tendency rather than trying to organize it away – for example, by publishing alternative contradictory ideas together with their consequences, and leaving the choice between them until a later time.

The following day and a half covered a variety of topics. Rabindra Mohapatra of the University of Maryland discussed neutrino masses and the grand unification of flavour. Further talks focused on the origins of neutrino mixing and oscillations, as well as on what the LHC might reveal about the origin of neutrino mass.

John Schwarz of Caltech gave an interesting review of the recent progress in the correspondence between anti-de Sitter space and conformal field theory, which is one of the most active areas of modern research in string theory. He focused mainly on the testing and understanding of the duality and the construction and exploration of the string theory duals of QCD. Other talks reported on string phenomenology and string corrections in QCD at LHC. Itzhak Bars of the University of Southern California described a gauge symmetry in phase space and the consequences for physics and space–time.

The sessions on quantum cosmology covered topics on black holes, dark matter, dark energy and the cosmological constant. These included a talk by Georgi Dvali of New York University, who discussed the physics of micro black holes.

The main sessions of the conference ended with a talk by Nobel laureate Kenneth Wilson of Ohio State University, a former student of Gell-Mann. He touched on a fundamental problem: could the testing of physics ever be complete? According to Wilson, in the real world no law about continuum quantities such as time, distance and energy can be established to be exact through experimental tests. Such tests cannot be carried out today, and cannot be done in the foreseeable future – although estimates of uncertainties can be improved in future. Wilson also took part in a discussion session with school teachers and students in a Physics Education Meeting held in conjunction with the conference.


The parallel sessions on particle physics, cosmology and general relativity attracted presentations by more than 30 speakers, many of whom were young physicists from Asia (China, China (Taiwan), India, Indonesia, Iran, Japan, Malaysia and Singapore). There was also a special session on quantum mechanics and complexity featuring invited speaker Kerson Huang of MIT, who gave a talk on the stages of protein folding and universal exponents.

• To mark the occasion of Gell-Mann’s 80th birthday, the publication of Murray Gell-Mann: Selected Papers, edited by Harald Fritzsch (World Scientific 2010), was launched during the conference.

Reviews of Accelerator Science and Technology Volumes 1 and 2

By Alexander W Chao and Weiren Chou (eds), World Scientific. Volume 1 Hardback ISBN 9789812835208, £55 ($99). E-book ISBN 9789812835215, $129. Volume 2 Hardback ISBN 9789814299343, £81 ($108).

The development of accelerators represents one of the great scientific achievements of the past century. The objective of this new journal – Reviews of Accelerator Science and Technology – is to give readers a comprehensive review of this dynamic field and of its various applications. The journal documents the tremendous progress made in accelerator science and technology, describes its applications to other domains, and assesses the prospects for the future development and use of accelerators.

The history and function of accelerators are told from their beginnings through to future projects in an extremely competent and complete manner, as the authors have themselves contributed in many ways to the success of the fields presented. The journal shows clearly how progress in science is strongly coupled to advances in the associated instruments, allowing us to see beyond the macroscopic world – into the finer structure of matter – and to apply these instruments to fields such as elementary particle physics, medicine and industry. From the structure of cells, genes and molecules to the Standard Model of elementary particles, the scientific developments are traced back to the early development of these versatile instruments.


Volume 1 presents the history of accelerators, from the first table-top machines to the colliders of today and those being planned for the future. It is written in a fashion that serves as a historical account while also providing the scientific and technical basis for a deeper understanding. The volume transmits the spirit of this truly multidisciplinary and international field. With an excellent bibliography for each chapter, together with the historical development of the science of accelerators and the contributions by key figures in the field, it succinctly describes the overall history and future prospects of accelerators.

The articles in this volume include a review of the milestones in the evolution of accelerators, a description of the various types of accelerators (such as electron linear accelerators, high-power hadron accelerators, cyclotrons, colliders and synchrotron-light sources) as well as accelerators for medical and industrial applications. In addition, various advanced accelerator topics are discussed – including superconducting magnets, superconducting RF systems and beam cooling. There is also a historical account of the Superconducting Super Collider, and an article on the evolution, growth and future of accelerators and of the accelerator community.


Volume 2 focuses on the first of many specific subfields, its theme being medical applications of accelerators. Out of about 15,000 accelerators of all energies in existence today, more than 5000 are routinely used in hospitals for nuclear medicine and medical therapy. The articles in this volume feature overviews of the medical requirements written by physicians; a review of the status of radiation therapy, radioisotopes in nuclear medicine and hospital-based facilities; a detailed description of various types of accelerators used in medicine; and a discussion on future medical accelerators. In addition, one article is dedicated to a prominent figure of the accelerator community – Robert Wilson – in recognition of his seminal paper of 1946, “Radiological Use of Fast Protons”.

These first two volumes of Reviews of Accelerator Science and Technology are timely, instructive and comprehensive. The journal is well laid out and, thanks to the many informative photos and diagrams, it is also easy to read. It is written in an impartial and balanced way and covers the achievements made at several laboratories around the world. To ensure the highest quality, the articles are written by invitation only and the submitted papers have all been peer-reviewed. An editorial board consisting of distinguished scientists has also been formed to advise the editors.

The journal represents an excellent balance between a historical account of the developments in the field and the technical challenges and scientific progress made with such machines. Volume 2 in particular comes at an auspicious moment because the synergies between the science behind accelerators and the related spin-offs, such as the applications of accelerators to fight disease, are of great importance to human health – with a profound impact on our society.

In conclusion, the journal is a tribute to accelerators and the people who developed them. It appeals to the expert as well as to all scientists working and applying the use of accelerators. Active scientists and historians of science will appreciate this chronicle of the development of accelerators and their key role in the progress of various domains during the past century. It should be on the shelf of every scientist working with accelerators and of those with an interest in the history and future directions of accelerators and their applications. I hope that it also inspires students to look deeper into accelerator science and technology and to choose this field as a career.

CERN – the knowledge hub


If you ask 10 people working at CERN how they would describe what CERN is in a single sentence, the chances are that you will get 10 different answers.

Most people think of CERN, first and foremost, as an accelerator “factory” and a provider of facilities for the experiments. Some would state that it is a high-profile research organization, as well as a formidable training centre. Others would emphasize that it is an attractive and responsible employer. Finally, some may point out that CERN is, among other things, a strong, internationally recognized “brand”.

They are all correct in some way because CERN is a complex system with manifold activities and worldwide impact, to an extent that is sometimes hard to appreciate from an in-house perspective. Personally, I like to think of CERN as a “knowledge hub”. In fact, despite people’s different views on what CERN is, they are all part of its knowledge-exchange network.

Knowledge from universities, research institutes and companies flows into CERN through the people who come to participate in its activities. New knowledge is generated at CERN and knowledge then flows out, for example through R&D partnerships and technology transfer and through those who leave.

CERN is actually more than a hub because it plays the role of an active “catalyser” in the exchange of knowledge. As a concrete example, in February 2010 the “Physics for Health in Europe” workshop took place at CERN. It brought together more than 400 participants – both medical doctors and technology experts from the physics community. Medical experts attending expressed their appreciation that CERN had organized the workshop, acknowledging the need for such cross-cultural and interdisciplinary events, which cannot easily be organized at a national level. The value of CERN both as a provider of technologies and as a catalyst for the community was widely recognized. There are, of course, many other activities where CERN makes similar contributions towards global endeavours, for example, the Open Access initiative and the deployment of a computing Grid infrastructure in Europe.

Some of the knowledge exchanges taking place across CERN’s network are structured, explicit and therefore easy to track. This is the case, for example, with technology-transfer activities, which are typically formalized through contracts that give third parties access to CERN’s intellectual property portfolio. Other knowledge-exchange processes are tacit or informal. For example, knowledge transfer through people’s mobility from CERN towards European companies is hard to track in a systematic way.

The CERN Global Network aims to facilitate knowledge exchange across the various groups described above and to improve the visibility of partnership opportunities related to CERN’s activities. It will also enable CERN to gather data on knowledge transfer through mobility.

This Global Network will welcome former and current members of the CERN personnel (including users), companies from CERN’s member states, universities and research institutes. It will deliver a database of members and a dedicated website, providing information about partnership and knowledge-sharing opportunities (training, new R&D projects, transferable technologies, jobs etc) across the community. It will also foster the creation of special interest groups and organize events at CERN.

The scope of the Global Network is broader than a typical “alumni” association because it aims to build and reinforce links between all of the key players in the knowledge-exchange process – be they individuals or institutions. Interactions between individuals will generate a CERN-specific social and professional network, while interactions between individuals and institutions will create value in areas such as recruitment by linking job seekers with potential employers. Finally, interactions between institutions will enable the exchange of best practice in specific thematic areas.

As a last point, I would like to stress that the importance of knowledge transfer through day-to-day exchanges with the general public cannot be overemphasized. No doubt most readers of this article are routinely asked by ordinary citizens to explain what CERN is. In these circumstances we are all acting as ambassadors for CERN, endowed with the responsibility to remove misconceptions about our field and to explain the role of fundamental research as a driver for innovation.

Contributing to communication with the general public is everyone’s responsibility – the CERN Global Network will provide its members with information about the CERN-related projects that make an impact on society and that can be used to illustrate how CERN concretely delivers value to the community, in addition to its contribution to the advancement of basic science.

Facilitating and catalysing knowledge exchanges are among the most valuable benefits that we at CERN can deliver to society. A few words from George Bernard Shaw suffice to illustrate why: “If you have an apple and I have an apple, and we exchange these apples, then you and I will still each have one apple. But if you have an idea and I have an idea and we exchange these ideas, then each of us will have two ideas.”

• For more about the CERN Global Network, see http://globalnetwork.cern.ch.

Claudio Parrinello, head of knowledge and technology transfer, CERN.
