The story of a forerunner to today’s mobile-phone screens.
“The proton synchrotron currently being built by CERN (the SPS) will be controlled centrally from three control desks, each with its own minicomputer. Only a few knobs and switches must control all of the many thousands of digital and analogue parameters of the accelerator, and an operator will watch the machine on at most half-a-dozen displays … An advantage of the new form of control is that since there are so few controls and displays, they may be made more elaborate and powerful.”
Thus begins a CERN report written in May 1973 by Frank Beck and Bent Stumpe of the controls group (Beck and Stumpe 1973). It describes two devices: the touch screen and the computer-controlled knob. CERN’s member states had approved the construction of the Super Proton Synchrotron (SPS) in February 1971. With its circumference of nearly 7 km, it was a giant machine for its day – some 10 times the size of the Proton Synchrotron (PS) that had started up in 1959. The scale of the new machine meant that control via individual cables linking directly to a central control room – as was done for the PS – would be economically unfeasible. One of the first tasks of the nascent SPS controls group, therefore, was to find a practical and economical solution.
The timing was just right for developing central control supported by computers. Industry was beginning to commercialize minicomputers, so the idea took shape of equipping each sector of the machine locally with minicomputers, controlled by message transfer from the central control room. This would eliminate the enormous requirement for cabling. The next question was how to create an “intelligent” system based on minicomputers to replace the thousands of buttons, switches and oscilloscopes that a conventional control system would need for a machine as large as the SPS.
A human has only two hands, but if control devices could be redefined fast enough by computer, then only one button (or knob or pointing device) would be needed to do the job of controlling many different devices or parameters. The main uses of the “master button” would be to select accelerator subsystems for control and monitoring, as well as to select from hundreds of analogue signals the ones to show on displays at any one time. The minicomputers made by Norsk Data at the time seemed to be powerful enough for such a system.
Frank Beck, who was to become head of the SPS Central Controls, was aware of the possibilities offered by existing touch-screen technology, in which a panel of buttons whose labels are written by computer can be redefined on the fly, so that the same buttons control different aspects of a system simply by touch. By presenting successive choices that depend on previous decisions, the touch screen would make it possible for a single operator to access a large look-up table of controls using only a few buttons.
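To make the arithmetic behind this concrete, here is a minimal sketch, in modern Python rather than anything from the original report, of how a fixed panel of buttons walking a menu tree reaches a large table of controls in a few touches; the menu contents are invented purely for illustration.

```python
# Sketch of a tree-structured menu driven by a fixed panel of buttons.
# The menu contents below are invented for illustration; the SPS system
# itself is not described at this level of detail in the article.

MENU = {
    "Main": ["RF", "Magnets", "Vacuum"],
    "RF": ["Cavity 1", "Cavity 2"],
    "Magnets": ["Dipoles", "Quadrupoles"],
}

def select(menu, presses, root="Main"):
    """Follow a sequence of button presses down the menu tree."""
    node = root
    for pressed in presses:
        labels = menu[node]
        print(f"{node}: buttons = {labels} -> pressed '{labels[pressed]}'")
        node = labels[pressed]
    return node  # the control (or sub-menu) finally reached

if __name__ == "__main__":
    # Two touches (button 1, then button 0) reach "Dipoles", even though
    # the full look-up table behind the screen may hold thousands of entries.
    print("Selected:", select(MENU, [1, 0]))
```

With, say, 16 buttons per screen, three successive choices already distinguish 16³ = 4096 entries, which is why so few physical controls could stand in for a conventional wall of switches.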
It was clear that the only practical way to create buttons with variable labels by computer at that time was on a cathode-ray tube (CRT) screen. The question then was how the computer could detect which button was being selected. The rather complicated mechanical designs that existed did not seem suitable for the SPS control system. For example, David Fryberger and Ralph Johnson at SLAC had invented a device based on acoustic waves – Rayleigh waves – travelling in the surface of a sheet of glass, which had already been used for accelerator control (Fryberger and Johnson 1971). This worked but required a bulky frame around the screen. Beck discussed this with his colleague Stumpe, from the Data Handling Division, and asked if he could suggest a better technical solution.
In a handwritten note dated 11 March 1972, Stumpe presented his proposed solution – a capacitative touch screen with a fixed number of programmable buttons presented on a display. It was extremely simple mechanically. The screen was to consist of a set of capacitors etched into a film of copper on a sheet of glass, each capacitor constructed so that a nearby flat conductor, such as the surface of a finger, would increase its capacitance by a significant amount. The capacitors were formed from fine copper lines – fine enough (80 μm) and sufficiently far apart (80 μm) to be invisible (CERN Courier April 1974 p117). In the final device, a simple lacquer coating prevented the fingers from actually touching the capacitors.
Stumpe was immediately recruited into the controls group to develop the necessary hardware, and the first capacitor to prove that the idea worked was produced at CERN in 1973. Chick Nichols was able to use ion-sputtering equipment available in one of the workshops to deposit a fine layer of copper or gold on a flexible, transparent Mylar sheet to make the first working device. A prototype glass screen with nine touch buttons followed soon after.
The fineness of the lines and their pitch meant that a great deal of care was needed to produce the screen, but it turned out to be possible with the techniques normally used to make printed circuit boards. At first, placing the copper layer on the glass appeared difficult and it proved impossible to get reliable adhesion with vacuum deposition. However, ion sputtering gave better results. By ensuring that the glass was scrupulously clean and by depositing the copper slowly – an hour for a layer of about 10 μm – it was possible to get adhesion strong enough for connections to be soldered to the copper on the glass.
The capacitance of each button was about 200 pF, increasing by about 10% when a finger came close. The method chosen to detect this change was a phase-locked oscillator circuit, which had recently become available as a single integrated-circuit chip. One circuit acted as a reference oscillator, while each button had a similar circuit of its own. The oscillator attached to a button locked to the frequency of the reference oscillator (120 kHz), so that a change in capacitance altered the phase but not the frequency. The phase shift was converted to a voltage shift, which indicated that the button had been touched. The circuit was highly immune to noise and transients. Moreover, any drifts would be common to both oscillators, so good thermal stability could be obtained with commercial components.
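A rough numerical illustration of that thresholding step, using the figures quoted above (about 200 pF at rest, roughly 10% more under a finger, a 120 kHz reference): the sketch below is modern Python, not the original analogue circuit, and the RC oscillator model, resistor value and trip threshold are assumptions chosen only to make the numbers work out.

```python
import math

F_REF = 120e3        # reference oscillator frequency quoted in the text (Hz)
C_REST = 200e-12     # button capacitance at rest, about 200 pF
TOUCH_FACTOR = 1.10  # a finger raises the capacitance by roughly 10%

def free_running_freq(c, r=6.6e3):
    """Free-running frequency of a simple RC-controlled oscillator (Hz).

    The RC model and the 6.6 kohm value are assumptions, chosen only so that
    an untouched button sits close to the 120 kHz reference; the internals of
    the actual chip are not described in the article.
    """
    return 1.0 / (2.0 * math.pi * r * c)

def phase_error_voltage(c, gain=1.0):
    """Voltage out of the phase comparator once the loop is locked.

    In lock the button oscillator runs at F_REF, and the steady-state phase
    error absorbs the offset between F_REF and its free-running frequency;
    model that error as proportional to the relative offset (arbitrary gain).
    """
    return gain * (F_REF - free_running_freq(c)) / F_REF

THRESHOLD = 0.05     # assumed trip level, about half the full touch signal

def touched(c):
    """Flag a touch when the phase-error voltage moves away from its rest value."""
    return abs(phase_error_voltage(c) - phase_error_voltage(C_REST)) > THRESHOLD

if __name__ == "__main__":
    print("at rest:", touched(C_REST))                  # False
    print("touched:", touched(C_REST * TOUCH_FACTOR))   # True
```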
Into production
As soon as it was clear that the system could successfully recognize which of the nine buttons was touched, Beck showed the prototype to those in charge of the SPS project. Even before reliability tests had been performed, the decision was taken to use the touch-screen system and begin development of the control software on the first minicomputers (Nord 1, and later Nord 10) that CERN had received from Norsk Data. This was definitely a risk, but had the decision not been made, the controls group would have had no option but to use conventional technology for central control of the SPS. Tests later proved the reliability of the technique.
The next step was the development of a more practical touch screen with 16 buttons. The new central SPS control room needed several devices and industry was soon involved. Manufacturing of the touch screen itself proceeded in collaboration with a Danish company, Ferroperm. This led to the development of a robust glass screen with reduced surface reflections. At the same time another Danish company, NESELCO, became involved in producing the electronic modules needed to drive the touch screen.
When the SPS started up in 1976 its control room was fully equipped with touch screens – apparently the first application of the capacitative touch screen in the world. Touch screens later took their place in modernized control systems for the PS, which had preceded the SPS by nearly 20 years, as well as for the subsequent and much bigger Large Electron–Positron collider. Some of these screens continued to operate until the new CERN Control Centre took over operations in 2006 – a lifetime of 30 years.
In 1977 CERN demonstrated the potential of the new touch screen for industrial control at no less a venue than the huge and famous Hanover Fair. In the hall for new industrial inventions, CERN presented the “Drinkomat”, with a complete operational console similar to the one used to control the SPS, including a Nord 10 computer. The system was built by Alain Guiard, who at the time was using a touch screen to control a large film-development installation at CERN, allowing exact control of the liquids used in the process. Through multiple choices on a touch screen, the Drinkomat allowed people to mix drinks and follow the process visually, foreshadowing the machines that came into CERN’s cafeterias nearly 30 years later.
By 1977 the capacitative touch screen was already available commercially and being sold to other users within CERN and to other research institutes and companies wishing to use the screens in their own control systems (Crowley-Milling 1977). Its use spread around the world: JET and the Rutherford Laboratory in the UK; KEK, Mitsubishi and the TOYO corporation in Japan; the Rigshospitalet in Denmark and the Hahn-Meitner Institute in Germany.
One reason behind the success of the system was the decision at CERN to build the electronic modules to the CAMAC standard, which was in use not only all over CERN but throughout the world. This made it easy for users to buy individual modules for integration into their own systems. By 1980, more than 10 different CAMAC modules developed at CERN had been brought to the market by NESELCO. Furthermore, a CAMAC module with an integrated computer for driving the touch screen was developed in 1977, shortly followed by a CAMAC crate computer using the Motorola 68000 microprocessor. These modules were integrated into an intelligent “Touch Terminal”, which was commercialized by NESELCO in 1980; it was the world’s first commercial touch-screen computer.
At CERN the Touch Terminal was used for the control of the Antiproton Accumulator, which allowed the SPS to become a proton–antiproton collider and gather fame for CERN through the discovery of the W and Z bosons and the subsequent awarding of the Nobel prize to Carlo Rubbia and Simon van der Meer.
The original touch screen had only 16 fixed “buttons” associated with distinct areas of the screen, but by 1978 it was already obvious that a more flexible way of dividing up the screen would have many advantages. Stumpe developed his original concept to create an X–Y touch screen, in which the position touched is sensed via two layers of capacitors corresponding to X and Y co-ordinates. Following prototype work at CERN, development began with NESELCO and the University of Aarhus, supported by Danish state development funds. The X–Y screen involved new techniques for metallization on various substrates, which became the subject of patent rights. Stumpe was asked to sign a nondisclosure agreement, which he refused to do because CERN required that all inventions should be published. At this point, CERN’s involvement with the further development of touch screens came to an end.
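A minimal sketch of the X–Y idea described above, again in Python and with invented numbers: each layer is read out as a list of per-strip capacitance changes, and the touched position is taken as the strip with the strongest signal in each layer. The strip counts, readings and noise threshold are illustrative assumptions, not details of the NESELCO or Aarhus designs.

```python
# Locate a touch on an X-Y capacitative screen from per-strip readings.
# Each value is the measured capacitance change on one strip (arbitrary
# units); the numbers and the noise threshold are purely illustrative.

def locate_touch(x_strips, y_strips, threshold=1.0):
    """Return the (x, y) strip indices with the strongest signal,
    or None if either layer shows no change above the noise threshold."""
    x_best = max(range(len(x_strips)), key=lambda i: x_strips[i])
    y_best = max(range(len(y_strips)), key=lambda j: y_strips[j])
    if x_strips[x_best] < threshold or y_strips[y_best] < threshold:
        return None
    return x_best, y_best

if __name__ == "__main__":
    x_readings = [0.1, 0.2, 4.7, 0.3]  # finger over strip 2 of the X layer
    y_readings = [0.2, 5.1, 0.1]       # and strip 1 of the Y layer
    print(locate_touch(x_readings, y_readings))  # -> (2, 1)
```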
The new CERN Control Centre (CCC), which oversees the control of CERN’s entire accelerator complex, including the PS, SPS and now the LHC, has no touch screens for accelerator control. Today the use of the ubiquitous mouse as a pointing device provides the same type of computer control. Moreover, PC-based systems with standard displays are inexpensive and easy to install. In 1972, when the touch screen was developed at CERN for controlling the new SPS, the situation was different: nothing was commercially available and every control device had to be invented, including the colour displays.
However, touch screens are by no means absent from the CCC: the operators often communicate with colleagues using mobile phones with capacitative touch screens. The idea invented at CERN in 1972 has been reinvented in many applications, from “Drinkomats” to rail and airline ticket machines to the multifunction phones that sit in many pockets – not only in the CCC but all around the world.