A Memorandum of Understanding was signed recently in Seoul by a CERN delegation comprising the director of research, Roger Cashmore, the advisor on non-member states, John Ellis, and the CMS resource manager, Diether Blechschmidt, and by Korea University president Jung Bae Kim, who signed on behalf of Korea.
The CERN representatives and the Korean project leader, Sung Park, also met Korea’s Minister of Science and Technology, and held discussions with Korean physicists and delegates from industry about the outline of plans for future R&D and for the mass production of the Forward Resistive Plate Chambers for the CMS experiment at CERN’s LHC collider.
Over the past two years, Korea has played an active role in R&D for this major CMS component. In the summer of 1998 a full-sized prototype was built and successfully tested at CERN, followed by the construction of a second full-sized prototype. Intensive R&D continues, coordinated by the Korea Detector Laboratory (KODEL).
Thirteen other Korean institutes are associated: Cheju National University, Chonnam National University, Chungbuk National University, Dongshin National University, Kangwon National University, Konkuk, Kyungpook National University, Seonam, Seoul National University, Seoul National University of Education, Sungkyunkwan, Wonkwang and Yonsei.
The signing of the Memorandum of Understanding represents a milestone for basic science research in Korea.
CP Violation, by I I Bigi and A I Sanda, Cambridge Monographs on Particle Physics, Nuclear Physics and Cosmology, ISBN 0 521 44349 0 (hbk £60/$95, 380 pages).
With new results from the classic kaon sector and with new B-factories now coming on line, CP violation is a major boom area in particle physics. This carefully written book would make a useful introduction and guide to the difficult theory of this phenomenon.
Beyond Conventional Quantization, by John R Klauder, Cambridge, ISBN 0 521 25884 7 (hbk £55/$85, 300 pages).
Extensions of conventional quantum pictures can sidestep some quantum embarrassments. This book will be useful to readers with a deep feel for quantum field theory.
Volume XIII in the Frascati Physics Series contains the proceedings of the memorial meeting held in Frascati in November 1998. The meeting marked the 20th anniversary of the death of electron-positron collider pioneer Bruno Touschek. The meeting was reported in CERN Courier.
Touschek made physics and physicists realize the importance of particle-antiparticle colliders, and opened the door to one of the most fruitful periods of particle physics research. Touschek himself was also an interesting and flamboyant figure. The presentations at the meeting underlined the importance of his contributions and his special character.
Chapters include: The Frascati decision and the AdA proposal, by Giorgio Salvini; Remembering Bruno Touschek, his work, his personality, by Carlo Bernardini; From AdA to ACO – reminiscences of Bruno Touschek, by Jacques Haïssinski; The ADONE results and the development of the quark-parton model, by Massimo Testa; Electron-positron storage rings from AdA to LEP, by Emilio Picasso; Physics at present electron-positron colliders, by Guido Altarelli; Physics at DAFNE, by Paolo Franzini; Status of DAFNE, by Miro Andrea Preger; The physics at an e+e− linear collider, by Marcello Piccolo. The book also has a list of Touschek’s scientific papers, some photographs and a few of Touschek’s sketches.
For more information on Touschek see The Bruno Touschek legacy, by Edoardo Amaldi.
The Standard Model in the Making: Precision Study of the Electroweak Interactions, by Dima Bardin and Giampiero Passarino, Oxford, International Series of Monographs on Physics, August 1999, ISBN 0 19 850280 X (hbk £80, 680 pages).
The past decade of particle physics experiments has been devoted to testing the standard electroweak theory, mainly at LEP, SLC and the Tevatron. The goal has been to probe the theory at the quantum-loop level by comparing quantitative predictions of radiative corrections with experimental data, for as many measurable quantities as possible.
From the theoretical side, the preparation of these precision tests has been a tremendous task that has involved hundreds of theorists for over 20 years. This book offers a complete compendium of the techniques and results in the calculation of radiative corrections.
No other book offers such a complete, exhaustive and authoritative description of the electroweak theory’s predictions for precision tests. All calculations are described in detail, different techniques and approaches are introduced and compared, and most of the results are explicitly derived and discussed. The tree-level results and the quantum corrections for all relevant physical processes and quantities are studied in detail.
The exposition is clear and only a basic knowledge of quantum field theory is assumed. Thus the book qualifies as a complete reference handbook for this domain of contemporary physics. Those interested in the overall physical picture and the main implications of precision tests can find more readable reviews elsewhere. However, this work will be invaluable for professional theorists looking for state-of-the-art reviews.
Weaving the Web, by Tim Berners-Lee and Mark Fischetti, HarperSanFrancisco, 1999, ISBN 0 06 251586 1 ($26).
If you’ve ever wondered what goes on in the mind of an inventor you could do a lot worse than delve into Tim Berners-Lee’s Weaving the Web. In it he and co-author Mark Fischetti explain the origins of the ideas that are now revolutionizing the communications landscape, and the vision that lies behind them.
From a childhood spent discussing maths at the breakfast table and building mock-ups of the Ferranti computers his parents worked on, Berners-Lee moved on to building his own computer out of salvaged pieces of electronics and an early microprocessor chip.
In 1980, he went to CERN on a six-month contract. There he wrote a hypertext program called Enquire to help him keep track of the complex web of who did what on the accelerator controls project he worked on. Back at CERN at the end of the decade, Berners-Lee transported the idea behind Enquire to the Internet, with the now well known results.
Berners-Lee’s book is a very personal account, and it’s all the more readable for that. Like most of us, Tim Berners-Lee has a mind that’s better at storing random associations than hierarchical structures. And, like most of us, his mind is prone to mislaying some of those associations. Enquire began as an effort to overcome that shortcoming and evolved into something much bigger.
Berners-Lee is an idealist, driven by the desire to make the world a better place and the profound belief that the Web can do that. Now far from the rarefied air of a pure research laboratory, Berners-Lee gives credit to the atmosphere in which his ideas were allowed to mature. “I was very lucky, in working at CERN, to be in an environment… of mutual respect and of building something very great through a collective effort that was well beyond the means of any one person,” he explained. “The environment was complex and rich; any two people could get together and exchange views, and even end up working together. This system produced a weird and wonderful machine, which needed care to maintain, but could take advantage of the ingenuity, inspiration, and intuition of individuals in a special way. That, from the start, has been my goal for the World Wide Web.”
If the structure of DNA and the nature of life qualify as the most profound discoveries of the 20th century, what will be those of the next? Such are the questions beloved by pundits as the calendar turns in this special year. We know that these questions are unanswerable. However, to appreciate quite how unpredictable the future is, it may be worth imagining that we were being asked such a question 100 years ago.
In the late 19th century some scientists believed that the basic principles of physics had been discovered and only the details remained to be worked through. An outstanding problem – the apparently perverse behaviour of the spectrum of radiation emitted by hot bodies – was solved 100 years ago with the invention of quantum theory. Our view of the world was utterly changed. Can we draw any parallels with the present? Perhaps the writer of a similar article 100 years from now will give the answer.
As the 19th century drew to its end, three great discoveries defined the 20th century and illuminated the nascent science that we now call particle physics. In less than three years, between late 1895 and 1897, Röntgen discovered X-rays, Becquerel found radioactivity and Thomson isolated the electron. For me, the discovery of X-rays and the electron typifies the hundred-year leap from then to where we are today.
Ask a member of the general public about X-rays and they will think of shadows of broken bones; ask a scientist, and they will point to X-ray crystallography. As Röntgen prepared to receive the first Nobel prize in physics in 1901, no-one foresaw Bragg’s work in X-ray crystallography, let alone that, half a century later, Crick and Watson would use this tool to resolve the structure of DNA. What many would regard as the most profound discovery in biology is, for readers of CERN Courier, recognizably applied physics.
Genetics in the 21st century is likely to be as revolutionary as electronics has been in the 20th century. It is electronics, and all else that has flowed from the discovery of the electron, that touches most people in our field today.
Nature’s fundamental pieces
J J Thomson marched into the Royal Institution on 30 April 1897 and announced his discovery of the electron, a fundamental constituent of all atomic elements. Thomson duly won a Nobel prize (1906) for showing that the electron is a particle; his son, G P Thomson, subsequently won the prize in 1937 for showing that the electron is a wave. That, however, is another story.
Jump forward in time to the late 1960s, and beams of electrons, accelerated over a distance of 3 km, were fired into targets of protons and neutrons at SLAC. These experiments showed that the cosmic onion does not end with the atomic nucleus. The ultimate nuclear constituents (for the 20th century at least) are the quarks.
The century had begun with the belief that atomic elements were nature’s fundamental pieces. It ended with the discovery of electrons and quarks. The electron is but one member of a family of six, known as leptons; there are six varieties of quark as well. No-one at the end of the 20th century knows for certain why six of one and half a dozen of the other is nature’s scheme, but the answer will probably be known by the end of the 21st century.
In 1897 J J Thomson, alone in his Cambridge laboratory, discovered the electron by means of a small glass tube, which was less than 27 cm long. By 1997 electrons were speeding around CERN’s LEP ring (a journey of 27 km) to meet their nemesis, positrons, which were unknown to Thomson but, mysteriously, known to mathematics before their discovery by humans.
It was with the discovery of the positron, and the anti-world, that the electron revealed the deep power of mathematics. In 1928, Paul Dirac took the two great theories of the 20th century – relativity and quantum mechanics – and applied them to the electron. The mathematics simply would not balance.
The greatest implication of Dirac’s equation (as it will be known for all time) was that it opened a window to an entirely new world. His equation had two solutions, one being the familiar negatively charged electron, while the other implied the existence of a bizarre mirror-image version, identical in all respects except that the sign of its electrical charge is positive rather than negative. This anti-electron, more succinctly known as the positron (positive electron), is an example of antimatter.
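The origin of the two solutions can be stated compactly (a standard textbook aside, not spelled out in the original article): the relativistic relation between energy and momentum that Dirac’s equation respects admits both signs of the square root,

$$E^2 = p^2c^2 + m^2c^4 \quad\Longrightarrow\quad E = \pm\sqrt{p^2c^2 + m^2c^4},$$

and it was the apparently unphysical negative branch that, suitably reinterpreted, described the positron.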
Dirac’s prediction of the positron seemed to many at the time to be science fiction. Up to that point the only particles known or predicted existed as constituents of the matter around us, namely electrons in the periphery of atoms, and protons and neutrons, which comprise the atomic nucleus. The positron, which had emerged from his equations like a rabbit from a magician’s cloak, had no place at all. However, the questions ended in 1932 when the positron was found in cosmic radiation, with a positive charge and a mass identical to that of its electron sibling.
Dirac’s theory, that for every particle there exists an antiparticle counterpart, is now recognized to be an essential truth: a glimpse of a profound symmetry in the fundamental tapestry of the universe. And here we have another of the great puzzles that are with us at the turn of the century. If, as experiment suggests, the Big Bang created particles of matter and antimatter in equal amounts, and they annihilate upon meeting, how is it that there is any material universe remaining? Where has all the antimatter gone? Crick and Watson revealed the nature of life as we now know it, but how did the universe manage to survive long enough, made of matter, to have provided the necessary circumstances for life to emerge?
While the annihilation of matter and antimatter is a puzzle for understanding our existence, it is, nonetheless, the annihilation of the simplest pieces, electrons and positrons, that has been the key to LEP. Accelerated around the 27 km ring, colliding positrons and electrons annihilate to produce, in a small volume and for a brief moment, energies far greater than are found in any star, akin to those prevalent in the universe when it was less than a billionth of a second old. Particles of matter and antimatter pour out from these “mini bangs”, replaying the basic processes that occurred at the Big Bang. To capture the fleeing particles, which are travelling close to the speed of light, huge detectors are required.
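A rough order-of-magnitude estimate (mine, not the article’s) puts a number on that comparison: equating a LEP collision energy of around 100 GeV with a thermal energy $E \sim k_B T$ gives

$$T \sim \frac{100\ \mathrm{GeV}}{8.6\times10^{-5}\ \mathrm{eV\,K^{-1}}} \approx 10^{15}\ \mathrm{K},$$

some hundred million times hotter than the core of the Sun.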
Recreating the Big Bang
It is when you stand alongside one of these behemoths and compare it with the little tube that Thomson used that you see 100 years of science and technology in metaphor. It was relatively easy for Thomson to isolate the electron because the universe had already done much of the preparatory work. Over the previous 10 billion years, electrons had been created, trapped in atomic elements and stored there in the solid matter of the new-born Earth until we arrived. They were ubiquitous in 1897 Cambridge. A small tube and genius then asset-stripped the atoms, aided by relatively primitive tools (which used electric and magnetic forces to move the electrons around), to reveal their existence and their properties.
Today, by contrast, we are looking at exotic forms of matter: heavy quarks, supersymmetric particles and the Higgs boson, all of which, theorists believe, existed briefly in the afterglow of Creation but now are no longer here. To find them, we have to restore the conditions of the new-born universe.
There are no mass-produced test tubes with which to mount an experiment of such magnitude. There is no customized “Big Bang apparatus” for sale in the scientific catalogues so that we can experience the first moments of the universe in our living rooms. This is not mere hype. To journey to the start of time you have to build all the pieces yourself, by transforming the earth, rocks and gases of our planet into tools that extend our senses. That is how it has been at LEP, and how it will be for CERN’s LHC.
Sand provides the raw material for the nervous system of computer chips, which will orchestrate the enterprise. Hydrogen gas, from which protons can be stripped, will supply the beams for the LHC. Ores dug from the ground, melted, transformed and turned into magnets will be capable of guiding beams of protons at 99.9999% of the speed of light. A myriad of other tools, which are the result of centuries of invention, are being assembled. When all is done, these tools of the millennium will reveal the universe, not as it is now, but as it was at Creation. This is a far cry from Thomson’s day.
The results will be gathered, not by a single person as in Thomson’s case, but by a team of thousands, working on several continents, communicating via the internet, which is powered by electrons. Not only science and technology, but even the sociology of research has evolved over these past one hundred years.
Our story began with the first Nobel prizes. Becquerel’s discovery of radioactivity (rewarded with a Nobel prize in 1903) is where I shall complete the tale. The agents of the weak force, the cookers of the elements, were discovered at CERN in the 1980s. At LEP in the 1990s, millions of examples of these Z and W particles have been made and measured to astonishing precision. As the mathematics of Dirac revealed the existence of the positron, so ‘t Hooft and Veltman’s theory of the weak force has enabled quantitative descriptions of the measurements at LEP. LEP had insufficient energy to materialize a top quark. Nonetheless, courtesy of ‘t Hooft-Veltman mathematics, its properties could be deduced in advance of its triumphant discovery at Fermilab in 1995. Now, on the threshold of the 21st century, we are in an analogous situation with the Higgs boson. The precision of LEP and the mathematics give us foresight of another research prize.
So the theory of ‘t Hooft and Veltman, earning the final physics Nobel prize of the 20th century, may be giving us a glimpse of one of the first great breakthroughs of the 21st century. Will the discovery of the Higgs boson, and its associated phenomena, turn out precisely as the theorists expect? Or will there be some unexpected twists: the first hints of profound truths that are at present beyond our ken? Theorists throughout history have created beautiful descriptions of the universe, often with astonishing implications. Ultimately it is experiment that decides by distinguishing fact from fancy.
What will CERN Courier be celebrating in its issue 100 years hence? Röntgen, Becquerel and Thomson could not have imagined DNA, Zs and Ws, electroweak theory, the World Wide Web, nor LEP and LHC (machines that take us to the start of time). If there is any message from this that we can be sure is a guide for the coming century, it is this: prepare for surprises.
The year is 2005, CERN’s LHC collider is running, and discoveries are on the horizon. Pete Bruecken and Jeff Dilks, high school teachers from Iowa, are telling students about their experiences of building detectors and carrying out beam tests at CERN. The students are even more interested when they learn that they will be analysing some new data from the LHC experiments that their teachers had a part in building.
As part of the QuarkNet programme, hundreds of teachers with similar experiences will have their students doing the first analyses of LHC datasets. These will be small datasets, filtered to be useful to students. This is an exciting opportunity because no-one else has analysed these data yet. There is always the possibility that the students will be part of an important discovery; the odds may be small but the potential is enormous. Furthermore, they will be communicating with students in other classrooms around the world, comparing notes about their findings and viewing the action at CERN, live via the Web. They will also be learning basic physics. Ultimately QuarkNet will reach 720 teachers and over 100 000 students.
Teachers Bruecken and Dilks were among 24 teachers who joined QuarkNet in 1999. After a week at Fermilab in June, learning about particle physics, they participated in seven weeks of research, funded by QuarkNet. Together with Professor John Hauptman at Iowa State University, Bruecken and Dilks constructed an extremely fast detector which, essentially, could collect energy and spatial information at the speed of light and then empty the calorimeter of signal in a nanosecond. Hauptman said, “The amazing thing about this module is that it was largely built on zero funds… and QuarkNet was essential for its success.”
The local newspaper reported, “Just imagine it: high school students watching cutting-edge particle physics experiments, analysing data and collaborating with scientists. How’s that for science homework?” QuarkNet plans to fly a number of students to CERN for the first physics runs so that they can report back on the events.
Introductory physics runs through much of high-energy physics: concepts such as conservation of momentum and energy, familiar to every student, are ubiquitous. Particle physicists use these concepts as they study the fundamentals of nature. Why not let students explore classical physics through the lens of particle physics? Wouldn’t this bring much more interest to their studies?
QuarkNet seeks to create such a lens. The project’s main goal is to involve high school students and teachers in the ATLAS and CMS experiments as well as in Run 2 of the CDF and DØ experiments at Fermilab. A year ago, Keith Baker (Hampton University), Marge Bardeen (Fermilab), Michael Barnett (Lawrence Berkeley National Laboratory) and Randy Ruchti (University of Notre Dame) organized the project. To carry out the programme, QuarkNet has hired four teachers/educators to run the summer activities, assist the centres in the development of their programmes and help monitor the success of the project. QuarkNet is supported by the US National Science Foundation, the US Department of Energy and the participating universities and laboratories. While QuarkNet began in the US, there have been expressions of strong interest from CERN and from other European countries.
Teachers aboard experiments
QuarkNet invites teachers to join groups of particle physics experimenters (their mentors) for an eight-week summer research assignment. This immersion in research gives the teachers time to become familiar with the experiments and provides them with an overview of particle physics. Physicists, from a university or laboratory, recruit the teachers from nearby schools. The institution’s needs and the teacher’s personal skills determine the research assignment. QuarkNet provides a stipend (a salary) for these teachers and, for those who leave home for extended periods of time, living expenses.
During the academic year, teachers invite their students into the project by integrating some aspect of their summer work into their physics curriculum. This does not mean that students must study the Standard Model. Students could study the conservation of momentum via analysis of data from a collider event. They could also discover the vital role of computers in modern science by examining thousands of events: a task that is impossible to do by hand. They may consider protons moving through the LHC as they investigate the force that magnetic fields exert upon moving, charged particles. Each of these, and other curriculum ideas, will be developed by the teachers and QuarkNet staff as the programme matures.
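As a minimal sketch of what such a classroom analysis might look like (the file events.csv and its column names are hypothetical illustrations, not an actual QuarkNet data format), a few lines of Python are enough to test momentum conservation across thousands of events:

    import csv
    import math

    # Each row is assumed to hold the momentum components (in GeV/c) of the
    # two particles produced in a collision event.
    with open("events.csv") as f:
        for event in csv.DictReader(f):
            # Sum the momentum components transverse to the beam axis.
            px = float(event["px1"]) + float(event["px2"])
            py = float(event["py1"]) + float(event["py2"])
            missing_pt = math.hypot(px, py)
            # If momentum is conserved and nothing escaped undetected,
            # the summed transverse momentum should be close to zero.
            print(f"missing pT = {missing_pt:.3f} GeV/c")

A student who watches this balance hold, event after event, is applying exactly the conservation law taught in the introductory classroom.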
During the school year, after their summer research assignment, QuarkNet teachers invite 10 other teachers from their area into the project. These associate teachers participate in a three-week institute, which is planned and hosted by the QuarkNet teachers and their local physicist mentors. Here they explore particle physics research and the classroom application of classical physics topics to the world of particle physics.
QuarkNet centres
Together, the 12 teachers and at least two physicist mentors comprise a QuarkNet centre. During the summer of 1999, QuarkNet established 12 centres at universities and laboratories from California to Massachusetts, and in many places in between.
Teachers participated in a one-week orientation workshop at Fermilab in preparation for the summer research assignments. During the week, the teachers attended talks on everything from accelerators to cosmology, and enjoyed tours of CDF and DØ along with explanations of upgrades for Tevatron Run II. They worked with hand-held cosmic-ray detectors brought from Notre Dame, and engaged in computer activities using Fermilab Run I data, computer simulations and material from the Web. The workshop featured time for teachers to pose questions to principal investigators Randy Ruchti and Michael Barnett and to synthesize a deeper understanding of physical phenomena. In addition, teachers discussed the classroom implementation of their research work on one of the four major collider experiments.
During their summer research the teachers took on varied and challenging projects. Larry Wray and Rosemary Bradley of Langston University in Oklahoma, under mentor Tim McMahon, worked on a project related to the “powers of ten”. Ulrich Heintz at Boston University involved Rick Dower in using LabVIEW to write interface and data-collection software for measuring the characteristics of a silicon tracker wafer to be used in DØ. Dower did this and then repeated his measurements in a neutron beam, generated by the low-energy (approximately 4 MeV) proton accelerator at the University of Massachusetts in Lowell, to test the effect of radiation on the wafer.
The CMS project
Much of the work for QuarkNet 1999 involved the CMS project at CERN. Kevin McFarland, of the University of Rochester, had Susen Clark and Paul Pavone test the long-term stability of scintillating crystals (to be used as reference standards for CMS). These two teachers also built a “muon telescope” cosmic-ray detector for classroom demonstrations. The work of the Iowa centre in CMS was explained well by John Hauptman of Iowa State University: “Nural Akchurin in Iowa City and I, in Ames, have a lot of good work to do, and Jeff [Dilks] and Peter [Bruecken] were right in the middle of it. Peter analysed radiation damage data, designed and built mechanical mounts for a new calorimeter, and next week he will start taking data in the LEP injector beam at CERN. Jeff was responsible for designing and building a new calorimeter in Ames, testing it at CERN this summer, and analysing data from it.”
At Notre Dame, LeRoy Castle and Dale Wiand worked with Randy Ruchti to design Optical Decoder Units (ODUs) for CMS. Both teachers became involved in negotiations with other CMS production sites to find satisfactory solutions to the question of how best to place the ODUs in the detector structure.
How does this experience influence teaching and learning? Students in Ames, Iowa, were performing an experiment in their physics class. They had divided up the parameter space of a dataset to save class time while still covering the necessary measurement parameters. Jeff Dilks had his students share their measurements by writing their data on the whiteboard. A plot of the measurements showed absolutely nothing. Over the weekend Dilks considered his options. He resolved to use this opportunity to show his students that science does not always yield what is expected.
On the Monday he started class by informing the students that their work truly modelled what goes on in the “real world” of science. The results that they had shared on Friday were nonsense and indicated that new and more precise measurements were required. The class discussed what changes could be made, assigned parameters and performed the measurements once again. This time a quick plot showed some interesting results. The lesson was learned: science is an involved process, with starts and stops, and it often yields results that raise more questions.
For about twenty years I was a member of the theory group at the Institute of Theoretical and Experimental Physics, Moscow (ITEP). ITEP was more than an institute: it was our refuge, where the insanity of the surrounding reality was, if not eliminated, at least reduced to a bearable level.
Doing physics there gave meaning to our lives, making them interesting and even happy. Our theory group was like a large family. As in any big family, of course, this did not mean that everybody loved everybody else, but we knew that we had to stay together and rely on each other, no matter what, in order to survive and to be able to continue doing physics. Our teachers considered this the most important thing, and the message was constantly conveyed to young people joining the group. We had a wonderful feeling of stability.
Rules of survival
The rules of survival were quite strict. First, seminars – the famous Russian-style seminars. The primary goal of the speaker was to explain the results, not merely to advertise them. If the results were nontrivial, or questionable, or just unclear, this would surface in the course of the seminar, and the standard two hours were not enough. The seminar could then last three or even four hours, until either everything was clear or exhaustion was complete, whichever came first. I remember one seminar in Leningrad in 1979, when Gribov was still there, which started at eleven in the morning. A lunch break was announced from two to three, and the seminar continued until seven in the evening.
At ITEP we had three, sometimes more, theoretical seminars a week. The leaders and secretaries of the seminars were expected to find exciting topics, either by recruiting ITEP or other “domestic” authors or, often, by picking up a paper or preprint from elsewhere and asking somebody to report the work to the general audience. This was considered a moral obligation.
The tradition dated back to the days when Pomeranchuk was the head of the group and its isolation was even more severe. In those days there were no preprints, and getting fresh issues of Physical Review or Nuclear Physics could not be taken for granted. When I joined the group as a student, a few years after Pomeranchuk died, I was taken to the Pomeranchuk memorial library, his former office, where a collection of his books and journals was kept.
Every paper in every issue was marked with a minus or a plus sign, by “Chuk’s” hand. If there was a plus, there would also be the name of a student who had been asked to give a talk for everyone’s benefit. Before the scheduled day of the seminar, Pomeranchuk would summon the speaker to his office to assess whether the subject had been worked out sufficiently and the speaker was “ripe enough” to face the audience and their bloodthirsty questions.
Scientific reports by the few chosen to travel abroad were an unquestionable element of the seminar routine. Attendance at an international conference was by no means considered a personal matter. Rather, these lucky guys were believed to be our ambassadors, and were supposed to represent the whole group. This meant that at a conference you could be asked to present important results of other members of the group. Moreover, you were supposed to attend as many talks as possible, including those outside your own field, make extensive notes and, on your return, deliver an exhaustive report of all the new developments, interesting questions raised, rumours and so on.
The rumours, as well as nonscientific impressions, were like an exotic dessert. I remember that after a visit to the Netherlands, one colleague mentioned that he was very surprised to see people on the streets smiling, and could not understand why. Then he finally figured it out: “because they were not concerned with building communism.” This remark almost immediately became known to the authorities, and cost its author a few years of “inexplicable allergy” to any Western exposure.
“Coffee seminars” typically lasted until nine, sometimes much later, for instance in the stormy days of the 1974 “November revolution”. The few months following the discovery of the J/psi were the star days of quantum chromodynamics, and probably the highest emotional peak of the ITEP theory group. Never were the mysteries of physics taken so close to our hearts as then. A spontaneously arranged team of enthusiasts worked practically nonstop. A limit to our discussions was set only by the Moscow metro – those who needed to catch the last train had to leave before 1 a.m.
Living in the capital of that empire had its advantages. All intellectual forces tended to cluster in the capital, so we had a very dynamic group in which virtually every direction was represented by several theorists, experts in the given field. If you needed to learn something new, there was an easy way to do it, much faster and more efficient than reading through journals or textbooks: you just needed to talk to the right person.
Educating others, sharing your knowledge and expertise with everybody who might be interested, was another rule of survival. Different discussion groups and large collaborations were emerging all the time, creating a strong and positive coherent effect. The brainstorming sessions used to produce, among other results, a lot of noise. So, once you were inside the old mansion occupied by the theorists, it was very easy to figure out which task force was where – you just had to step out into the corridor and listen.
Fashionable physics
There is strong pressure in the world community to stay in the mainstream: to work only on fashionable directions and problems under investigation in dozens of other laboratories. This pressure is especially damaging for young people who have little alternative. Of course, a certain amount of cohesion is needed, but the scale of the phenomenon we are witnessing is unhealthy.
The isolation of the ITEP theory group had a positive side effect. Everybody, including the youngest members, could afford to work on unfashionable problems without publishing a single line for a year or two. On the other hand, it was considered indecent to publish results of dubious novelty, incomplete results, or just papers with too many words per given number of formulae.
Dense papers were the norm. This style, probably perceived by readers as a chain of riddles, is partly explained by tradition, presumably dating back to Landau’s time. It was also due to Soviet conditions, under which everything was regulated, including the maximum number of pages any given paper could have.
A collaboration agreement between CERN and the International Science and Technology Centre (ISTC), finalized in November and worth some 12 million Swiss francs, is a large step forward in CERN-ISTC co-operation. The agreement covers equipment for the big ATLAS and CMS experiments, which from 2005 will run at CERN’s new LHC proton collider. It falls within the framework of the ISTC Partnership Project (more next month). Contributions to all such ISTC projects had previously amounted to about $14 million, and the new CERN projects add some $8 million to this sum.
The lion’s share of this new co-operative effort goes towards the lead tungsten crystals for the CMS experiment’s electromagnetic calorimeter. ISTC scientists will also deliver the huge wheels (24 m in diameter) to support the muon chambers on the outside of the ATLAS experiment. This mutually profitable new avenue for research and development work should lead to fresh proposals and contracts.