The first of the 420 short straight sections for CERN’s LHC collider has been successfully ramped to full current in tests at CERN. These short straight sections, containing superconducting quadrupole magnets to keep the beams tightly focused, have been designed and prototyped by the French Atomic Energy Commission laboratory at Saclay and the neighbouring CNRS-IN2P3 laboratory at Orsay. These laboratories will also be responsible for their industrial follow-up as part of France’s special host-state contribution to the LHC project.
After preliminary tests at 9000 A and brief training, the first short straight section was soon ramped up to 13 000 A, the maximum current delivered by the LHC power converters and well above its nominal operating current of 11 870 A, which corresponds to a field gradient of 223 T/m. Following a thermal cycle to room temperature and back to 1.8 K, the short straight section could be ramped up directly to 13 000 A without any further training quenches.
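As a rough sanity check, the gradient reached at the 13 000 A test current can be estimated from the nominal figures quoted above, assuming the quadrupole field gradient scales linearly with current (an idealization that ignores iron saturation; it is not a stated property of these magnets):

```python
# Hedged estimate: assume gradient scales linearly with current below saturation.
NOMINAL_CURRENT_A = 11870.0        # nominal operating current (from the text)
NOMINAL_GRADIENT_T_PER_M = 223.0   # corresponding field gradient (from the text)
TEST_CURRENT_A = 13000.0           # maximum current of the LHC power converters

# Linear scaling; saturation would reduce the real value somewhat.
estimated_gradient = NOMINAL_GRADIENT_T_PER_M * TEST_CURRENT_A / NOMINAL_CURRENT_A
print(f"Estimated gradient at {TEST_CURRENT_A:.0f} A: {estimated_gradient:.0f} T/m")
# prints: Estimated gradient at 13000 A: 244 T/m
```

The roughly 10% margin over the nominal gradient illustrates why reaching 13 000 A without training quenches is a comfortable result.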
A new beam-extraction system considerably extends the capabilities of the unique Nuclotron accelerator at the Joint Institute for Nuclear Research in Dubna, near Moscow. In the past several years the Nuclotron has provided circulating beams of hydrogen, deuterium, helium, carbon and krypton nuclei inside its 250 m superconducting ring, with energies of up to 4.2 GeV per nucleon and intensities of up to 10¹¹ particles per second (for light nuclei).
Initial experiments used targets inside the ring, but the construction of a special system for extracting the beam to external detectors is now complete. Last year the system was tested and the first extraction of a proton beam from the ring was achieved.
March saw the second full-scale Nuclotron run with the new nuclear beam-extraction system. The parameters of an extracted deuterium beam with an intensity above 10⁹ particles per second were studied, and the beam was supplied to several experiments. The new data are being processed.
The Nuclotron beam-extraction system opens up new horizons for physics research. The construction of the Nuclotron and its physics programme are carried out within the framework of a wider international collaboration among the JINR member states.
Physicists at Brookhaven and the State University of New York, Stony Brook, have just completed the construction of the Forward Preshower Detector (FPS), one of four inner tracking subdetectors for the D0 experiment at Fermilab’s Tevatron proton-antiproton collider. Both D0 and the Tevatron are currently undergoing major upgrades prior to the next run, which is set for March 2001.
As experiments at CERN using high-energy nuclear beams have shown, about one ten-thousandth of a second after the Big Bang the universe’s quarks and gluons crystallized into protons and neutrons, changing for ever the texture of the microworld and casting the form for the nuclear matter that dominates our universe. In physics-speak, free quarks became “confined” in subnuclear particles, where they have remained ever since.
Once accomplished, this quark confinement is extremely difficult to unravel, and CERN’s experimental programme using high-energy nuclear beams has shown that recreating these initial Big Bang conditions requires energy, perseverance and insight.
However, the quark/gluon origins of nuclear matter are now well understood. Less well understood is how the complex pattern of nuclear matter that we see around us relates to, and results from, quark-gluon confinement. It is somewhat analogous to the realization that all biology can ultimately be related to the DNA structure of genetic material. However, until the underlying genome structure is revealed through detailed experiments, the connections are difficult, if not impossible, to make and the science remains highly empirical.
Nuclear physics therefore needs a “genome project” to map its quark/gluon structure. This underlies the Electron Laboratory for Europe (ELFE) proposal for continuous electron beams in the 15-30 GeV range. ELFE is promoted by the Nuclear Physics European Collaboration Committee (NuPECC), which is the “parliament” of European nuclear physicists.
This requirement for powerful electron beams has resulted in several studies on both the physics and the machine sides. The first ELFE machine proposal in 1993 was for a “green field” design. A 1997 study at DESY, Hamburg, looked at the possibility of capitalizing on DESY’s plans for a new linear electron-positron collider. The latest machine study, at CERN, aims to exploit the valuable hardware and expertise that will become available when CERN’s 27 km LEP electron-positron collider is finally closed.
LEP will begin its 2000 operations in April, but from October the LEP schedule reads “LEP dismantling”, so that the tunnel can be prepared for the LHC proton collider, scheduled to begin operations in 2005.
In 1998 Günther Geschonke and Eberhard Keil at CERN published an idea for an electron machine based on “salvaged” LEP components, particularly its valuable super-conducting radiofrequency acceleration cavities. This idea now forms the focus of a conceptual design report, ELFE at CERN, which pulls together the underlying physics objectives, requirements for experiments and experimental areas, and the machine itself.
Cryogenics infrastructure, including helium compressors, installed for LEP will enjoy a prolonged life at the LHC and thus will not be available for ELFE.
The outcome of an intense R&D programme is that LEP is now fitted with hundreds of niobium-coated radiofrequency cavities achieving accelerating gradients of 7 MV/m, which push its beam energy to 100 GeV and beyond.
In the ELFE at CERN scheme, these cavities would be redeployed in one straight section of a flat racetrack. The linac itself, with the LEP superconducting cavities, would be 1080 m long, but adding the elements to spread and combine the beams at each end takes the total length to 1500 m. The opposite straight section would have six vertically stacked beamlines to handle the successive passes of the beam, with a 3.5 GeV energy gain on each of the seven passes through the linac. With the required tight momentum spread in the beam (less than one part per thousand) influencing the bending that could be achieved, the two straight sections would be linked by 300 m diameter arcs. The new ring could be accommodated near CERN’s existing North Experimental Area, with experiments in or very near existing halls.
To maximize the scope of the experimental programme, ELFE is foreseen as handling polarized particles. These would be injected at 800 MeV either via a superconducting linac using experience gained from the TESLA international project based at DESY or via racetrack microtrons, as with the MAMI machine at Mainz.
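Putting the figures above together (800 MeV injection, seven passes through the linac at 3.5 GeV energy gain per pass), the final beam energy lands at the top of the 15-30 GeV range quoted for ELFE. A simple check:

```python
# Final energy of the ELFE recirculating scheme, from the figures in the text.
INJECTION_ENERGY_GEV = 0.8       # polarized beam injected at 800 MeV
ENERGY_GAIN_PER_PASS_GEV = 3.5   # gain on each pass through the LEP-cavity linac
N_PASSES = 7                     # seven passes through the linac

final_energy = INJECTION_ENERGY_GEV + N_PASSES * ENERGY_GAIN_PER_PASS_GEV
print(f"Final energy: {final_energy:.1f} GeV")  # prints: Final energy: 25.3 GeV
```

Six return beamlines suffice for seven linac passes because the beam is extracted to the experiments after the final pass rather than recirculated.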
The ELFE at CERN conceptual design report is now seen as definitive and will serve as input to future presentations of the study.
The changing face of physics and physics research is underlined by the increasing use of and emphasis on sophisticated control systems. Once dominated by systems for big particle accelerators, control systems are now widely used in other major facilities, and increasingly in large experiments.
This was demonstrated at the recent International Conference on Accelerator and Large Experimental Physics Control Systems (ICALEPCS), the scientific and technical programme of which covered controls for, among others, particle accelerators, detectors, telescopes, nuclear fusion devices and nuclear reactors.
ICALEPCS’99 saw an increased number of contributions from the plasma physics and astronomy communities and also, although to a lesser extent, from the particle detector community. Philippe Charpentier of CERN presented a memorable talk entitled “The evolution of the DELPHI experiment control system – how to survive 10 years of running”.
ICALEPCS looked at all aspects – hardware and software – of experimental physics control systems, but concentrated on how controls can contribute to the success of a major experiment. With this objective in mind, different technology and engineering issues were covered. State-of-the-art software and hardware technologies were reviewed in terms of the possibilities that they offer for dealing with systems of increasing complexity and sophistication within restricted budgets and human resources.
Software
In the software domain, several of the applications described are based on Windows NT and use the Common Object Request Broker Architecture (CORBA) as a distributed programming model. Examples included A Goetz (ESRF, France) with “Tango – an object oriented control system based on CORBA”, and C Scafuri (Sincrotrone Trieste, Italy) with “The ELETTRA object-oriented framework for high-level software development”, which also uses CORBA as its distributed programming model, with Java as the programming language.
Noteworthy is the growth of Windows 98/NT, followed closely by Linux. Both are competing with the more traditional UNIX platforms. Increased geographical distribution of systems as well as requirements relating to remote observation and monitoring lead naturally to the application of the Web and related technologies (J Farthing, JET, UK – “Technical preparations for remote participation at JET”).
The crucial role played by well integrated centralized data repositories was also emphasized by H Shoaee (SLAC, Stanford) – “The role of a central database for configuration management”. Indeed, controls are no longer stand-alone systems but rather part of a unity that ties physics to other areas, both technical and administrative, in a Computer Integrated Manufacturing environment.
Although the Experimental Physics and Industrial Control System (EPICS) is still rather popular as a framework and set of tools for developing control system software, both in the US (K White – “The evolution of Jefferson Lab’s control system”) and in some non-US labs, commercial Supervisory Controls and Data Acquisition (SCADA) systems are now penetrating the experimental physics “market” as well (A Daneels, CERN – “What is SCADA?”).
SCADA systems prove to be effective and efficient in controlling infrastructure systems such as vacuum, cryogenics, cooling, ventilation and personnel access, and in controlling experimental physics processes such as some small to medium-sized particle detectors.
In the wake of SCADA, technologies such as OLE for Process Controls (OPC) and SoftPLC are becoming more popular.
Hardware
The hardware domain makes increasing use of commercial Programmable Logic Controllers (PLC) connected to devices via fieldbuses, and of PCI (Peripheral Component Interconnect) and its related standards.
With restricted resources, and individual in-house development minimized in favour of buying industrial systems, the task of experimental physics control specialists is steadily moving towards the integration of these industrial products into an overall comprehensive and consistent control system.
Networks are being re-engineered using 100 Mbit/s Ethernet with GigaEthernet backbones, while the Asynchronous Transfer Mode (ATM) is also considered a candidate technology for the long-distance communication of time-critical accelerator data.
Of particular importance are timing systems (T Korhonen, PSI, Switzerland – “Review of accelerator timing systems”). Telescopes, as well as tokamaks and accelerators, require highly stable, highly precise and highly flexible timing, covering both event-timing and counter-based systems.
The increasing complexity and sophistication of physics processes lead to the introduction of ever-more complex feedback systems, often themselves relying on measurements that need high data rates. Such high-performance measurements may require sampling rates as high as hundreds of megahertz and state-of-the-art Digital Signal Processors (DSP) (J Lister, CRPP-EPFL, Switzerland – “The control of modern tokamaks”; J Safranek, SLAC – “Orbit control at synchrotron light sources”; and T Shea, Brookhaven – “Bunch-by-bunch instability feedback systems”).
In particular, new developments in the field of accelerator power supplies are taking advantage of the available digital technology by the use of embedded DSP controllers; the digital generation of high-stability, high-precision reference signals; and real-time algorithms for regulation (J Carwardine and F Lenkszus, Argonne – “Trends in the use of digital technology for control and regulation of power supplies”).
Engineering and management
The frequently unappreciated engineering and management aspects of control systems were also highlighted. The weight of maintenance and adaptation costs in software projects was discussed.
In the context of increasingly elaborate systems and reduced resources, and considering the progress demonstrated by industry in keeping proper control of the lifecycle of software development, project management and engineering have shown their worth in today’s physics world as well. Particular attention was paid to requirements engineering, and emphasis was given to sharing experiences and techniques in these fields. Applications of solutions from the industrial world were also presented and discussed.
News content came from status reports from a variety of control and data acquisition projects of new experimental physics facilities. Among them were the Swiss Light Source (SLS), which is being built at the Paul Scherrer Institute in Villigen, Switzerland (S Hunt, PSI – “Control and data acquisition system of the Swiss Light Source”), and the Spallation Neutron Source (SNS), which is to be built in Oak Ridge, US (D Gurd, Los Alamos – “Plans for a collaboratively developed distributed control system for the Spallation Neutron Source”). Gianni Raffi from the European Southern Observatory summarized the meeting.
Alongside the conference, two preconference workshops covered EPICS and SOSH (Software Sharing); these were organized by M Clausen of DESY and W A Watson of Jefferson Lab, respectively.
During the conference, a round table discussion, “Prospective directions in controls in geographically distributed collaborations”, chaired by W Humphrey (SLAC) and involving H Burckhart (CERN), R Claus (SLAC), J Farthing (JET), D Gurd (Los Alamos) and G Raffi (ESO), focused on the management of projects developed by distributed teams and on the experience with the available technologies for long-distance interaction.
Four tutorials covered special topics: “Cases for requirements capture and tracing” (G Chiozzi, ESO); “Network technology” (G Montessoro, Udine); “Introduction to JAVA” (J P Forestier, OSYX, France); and “Introduction to OPC” (OLE for Process Control) (F Iwanitz, Softing, Germany).
ICALEPCS’99, the seventh biennial conference, was held in Trieste on 4-8 October 1999, hosted by Sincrotrone Trieste. It took place at the “Stazione Marittima”, which has recently been restored as the city’s congress centre.
The meeting was organized by Sincrotrone Trieste in conjunction with the European Physical Society’s (EPS) Interdivisional Group on Experimental Physics Control Systems and the Istituto Nazionale di Fisica Nucleare. The International Scientific Advisory Committee was chaired by D Bulfone of Sincrotrone Trieste and A Daneels of CERN.
The meeting brought together some 400 control specialists from 32 different countries, covering Africa, the US, Asia and Europe, and representing 116 organizations. The proceedings are available at “http://www.elettra.trieste.it/ICALEPCS99/”. An industrial programme included an exhibition and seminars.
During the conference the EPS Experimental Physics Control Systems prize was awarded for the first time. It went to T Wijnands of CERN for an advanced plasma control system for TORE SUPRA.
The collaboration for the Compact Muon Solenoid (CMS) experiment at CERN’s future LHC collider will base its tracker entirely on silicon sensor technology using fine feature size electronics (subject to the approval of an addendum to the experiment’s Tracking Technical Design Report by the LHC committee).
The tracker is the innermost module of the huge detector, picking up signals from particles that are produced by the colliding LHC beams. Building such a 6 m long and 2.5 m diameter high-technology precision instrument for the demands of LHC physics is a major challenge.
The decision to go all-silicon follows unexpectedly rapid recent advances in read-out for microstrip detectors, in the fabrication of sensors on 6 inch diameter silicon wafers, and in automated assembly techniques for an all-silicon detector. It is a significant departure from the CMS baseline tracker proposal, which foresaw a central region of silicon devices surrounded by microstrip gas chambers (MSGCs).
The decision was a difficult one for CMS, coming soon after the successful demonstration, in beam tests at the Paul Scherrer Institute in Switzerland, that MSGCs would be equal to the task of particle tracking at the LHC.
In the mid-1990s MSGCs seemed to offer an economical alternative to silicon. In early implementations, however, their performance was found to deteriorate significantly with increased exposure to ionizing particles. This would have ruled them out for prolonged use at the LHC, where unprecedented particle fluxes are expected.
Nevertheless, solutions to these teething problems seemed to be available and CMS chose MSGCs as their baseline proposal, on the condition that certain milestones were reached. These were successfully achieved, but silicon-related technology was advancing in parallel, reducing the cost advantage that MSGCs offered.
A decisive factor in reducing the tracker’s price tag, by almost 6.5 million Swiss francs, was the development by CMS of a CMOS read-out chip using low-cost technology, originally aimed at increasing the compactness of computer chips. With a feature size of 0.25 μm compared with the 1 μm of conventional CMOS chips, the new APV25 chip is certainly compact. What the computer industry did not plan for, however, is that it is also extremely radiation-hard, with lower noise and power consumption than a conventional CMOS chip. This combination of features is ideal for CMS’s needs. It also provided important input to the choice of an all-silicon detector.
The other decisive factor is that silicon detectors are already widely available from industry in large quantities, and their price has been falling. Coupled with new automated assembly techniques developed at CERN, silicon therefore gave CMS the greatest chance of completing its full tracker on time and budget for LHC start-up.
“We believe that a new era is being entered, of assembling an enormous detector using automated assembly techniques,” said Geoff Hall of the CMS collaboration, “which is only possible today with silicon.”
The UK’s Synchrotron Radiation Source (SRS) at Daresbury, near Manchester, is to be replaced by a new machine at the Rutherford Appleton Laboratory near Oxford that should be operational in 2006. Funded by the UK and French governments and by the Wellcome Trust, the “Diamond” synchrotron, operating at several gigaelectron-volts, will serve a several thousand-strong academic and industrial research community spanning a range of scientific disciplines.
Daresbury’s 2 GeV SRS source, completed in 1980, will eventually be phased out, but over the next few years will benefit from additional beamlines. A recent addition is BALLAD (Belfast and Leicester Line at Daresbury).
The SRS was one of the first electron accelerators to be built solely for providing synchrotron radiation. Short-wavelength electromagnetic synchrotron radiation is emitted by high-energy electron beams as they are bent, and saps the power of high-energy synchrotrons. In the mid-1950s, scientists realized that this intense radiation could be used in its own right to study molecular and other structures over a range of pure and applied research. Synchrotron radiation users began to group around high-energy electron machines built for particle physics, but such was the appeal of synchrotron radiation research that purpose-built synchrotron radiation sources became popular.
In modern synchrotron sources, the radiation is produced by “insertion devices” placed in the straight sections of the synchrotron to “shake” the beam, rather than using the radiation emitted as the electrons are bent round the arcs of the machine.
The Daresbury lab was established in the 1960s to house the 4 GeV National Institute Northern Accelerator (NINA) electron machine to complement the Nimrod proton machine at Rutherford. With the UK’s decision in the early 1970s to close its national particle accelerators and concentrate particle physics research at CERN, a plan emerged for a dedicated synchrotron electron source – the SRS – to replace NINA and to benefit from the existing Daresbury infrastructure.
A scheme for a major new French national synchrotron radiation source – Soleil – was cancelled last year and France decided instead to become a major partner in Diamond. The substantial Diamond support from the Wellcome Trust, the world’s largest biomedical research charity, underlines the broad appeal of synchrotron radiation as a research tool.
The cancellation of the original French scheme and the switching of UK synchrotron radiation sources from Daresbury to Rutherford have both been the subject of intense debate.
With excitement mounting over new possibilities for “neutrino factories” using muon storage rings, a new experiment at CERN will study in detail the yield of pions produced by high-energy protons. The results will provide valuable input to these neutrino factory designs.
The Hadron Production Experiment at the CERN PS proton synchrotron (HARP) is a collaboration of institutes in Austria, Belgium, Bulgaria, France, Italy, Russia, Spain and the UK as well as CERN and the JINR in Dubna, near Moscow.
For a neutrino factory, an intense proton driver accelerator would pile several megawatts of beam power into a specially designed target to make pions, which would then be magnetically collected. These pions would then decay into muons, which in turn would give neutrinos. Existing data on the yield of pions by protons are inadequate for the detailed design of the proton driver and target for such a machine.
HARP has another major objective. The quest for a better understanding of the phenomenon of neutrino oscillation requires an improved knowledge of atmospheric neutrino fluxes. The yield of neutrinos by cosmic rays from outer space hitting nuclei in the atmosphere is a vital input to any calculation on atmospheric neutrino effects. This neutrino flux requires a knowledge of the energy and composition of the primary cosmic rays, and the subsequent production of secondary particles by collisions with nitrogen and oxygen nuclei. The former is steadily improving from information gathered in a new round of balloon flights, but the latter needs additional effort.
HARP will use protons of up to 15 GeV from CERN’s proton synchrotron and detector elements salvaged from previous major experiments at CERN and elsewhere – the prototype Time Projection Chamber from the Aleph experiment at CERN’s LEP electron-positron collider, drift chambers from the Nomad neutrino experiment at CERN, calorimeter modules from the Chorus neutrino experiment at CERN, a spectrometer magnet from Orsay, etc. The aim is to study secondary particle production over a range of angles and energies. With its legacy of inherited equipment, HARP will be up and running very quickly, and expects to take data next year.
The regular IEEE Nuclear Science Symposium is a shop window for developments in technology and instrumentation for both nuclear science and spin-off areas. For both of these directions, one of the highlights of the recent Seattle meeting was the parallel sessions on inorganic scintillators.
Scintillators now form a major focus for calorimetry (energy measurement) in several large high-energy physics detector projects – Belle and BaBar are already in operation with CsI(Tl) crystals, and even bigger projects are under construction, such as the lead tungstate investment for the CMS and ALICE experiments at CERN’s future LHC collider. Other future studies, such as BTeV at Fermilab, and some nuclear physics studies also focus on lead tungstate.
These groups have organized profitable collaborations with crystal manufacturers and producers. Here the goals are to optimize the crystals for the requirements of the particular experiment as well as mass-production.
Such scintillators are also being used increasingly in the important spin-off sector of medical imaging for a new generation of cameras for nuclear medicine. With clinical studies and diagnostics supplemented by basic biomedical research and pharmacokinetics evaluations, this camera market is expanding rapidly. In the quest for optimal image precision and contrast at minimal patient radiation dose, the requirement for high spatial and time resolution clashes with the need for high sensitivity.
Basic research into energy transport and scintillation mechanisms, initiated at CERN and in the US some 10 years ago, helps to define guidelines for the development of new, dense, high-light-yield, fast materials. After years of hit and miss, an element of predictability has emerged. For example, it is no surprise that lutetium or gadolinium are the main constituents of most of the new host lattices and that cerium is the favourite luminescence activator for fast scintillation.
Several presentations on lithium- and boron-based scintillators confirmed the rapidly growing interest in new neutron-sensitive materials for use in neutron spallation source projects and the monitoring of nuclear waste.
Seattle also featured the IEEE Medical Imaging Conference (MIC). This meeting evolved from the traditional Nuclear Science Symposium (NSS) a number of years ago and now runs in parallel under a common organizing umbrella.
This reflects the interest in applying nuclear science techniques in medicine. The identification of the MIC as a mature entity within the same framework as NSS allowed the scope of the NSS/MIC to expand, so that today it is not only the most important meeting in the international calendar for new developments in detectors and acquisition systems for the detection of ionizing and non-ionizing radiation in general, but also the main meeting for presenting new nuclear medicine systems along with algorithms for medical image reconstruction. The number of submitted abstracts was more than 500, shared equally between MIC and NSS.
Chaired by Grant Gullberg and Larry Zeng from Utah, the recent MIC began with a joint session, shared with the NSS, on new detectors for measuring gamma radiation. Solid-state detectors have not yet made great inroads in medical imaging devices, but this session showed that this transition is starting to take place.
In general, there was great interest in new detectors. The field of positron emission tomography (PET) is moving into the vast arena of clinical diagnostic application with established inorganic scintillators (NaI(Tl), bismuth germanate – BGO), but at the same time new detector materials are being incorporated into clinical systems.
Lutetium oxyorthosilicate doped with cerium, originally developed by Schlumberger-Doll for geophysical applications, looks certain to be the detector material for annihilation radiation measurements in medicine for the next decade or more. This scintillator has almost the same stopping power as BGO but offers a much faster decay response time.
Presentations on this exciting new scintillator included a report on the first full PET system built with this material by one of the main companies in the field (CTI PET Systems, Knoxville, Tennessee) together with the Max Planck Institute in Cologne.
Other topics included new scanners that combine traditional anatomical devices – X-ray computed tomography (CT) and magnetic resonance imaging (MRI) – with features from PET and its gamma-emitting counterpart, single photon emission computed tomography (SPECT); new algorithms for solving the “inverse problems” encountered in medical imaging; and applications of simulation in the design and testing of imaging systems.
A number of papers focused on the problem of measuring the density distribution of the body for use in correcting for the attenuation of photons, and allowing for the unwanted contribution of Compton scattered photons in images. New detectors with faster response times and improved energy resolution may contribute to this area.
While the meeting was dominated by emission tomography (single-photon and positron emitters), there is increasing interest in MRI, spiral X-Ray CT and other medical imaging methods.
Arrangements are well advanced for the IEEE Nuclear Science Symposium and Medical Imaging Conference 2000 in Lyon in October, the first time that this important meeting will come to Europe. In addition to the usual scientific papers, the organizing committee anticipates increased interest from technical exhibitors, as well as a number of short courses aimed at PhD and postdoctoral level on topics of current interest in image reconstruction and multimodality imaging. These would also serve as a useful introduction for an experienced scientist or engineer in high-energy physics with an interest in medical applications. This year’s NSS/MIC Web site is at “http://nss2000.in2p3.fr/”.
In a move that underlines the growing requirement for sophisticated hardware for precision physics experiments in space, NASA has announced an award to Stanford University for the development of the GLAST space-based gamma-ray telescope.
GLAST (Gamma-Ray Large Area Space Telescope) will be built as a collaboration of NASA, the US Department of Energy and specialists in France, Germany, Italy and Japan. It will detect electrons and nuclear particles accelerated to ultrahigh energies beyond those attainable on Earth.
Management of the project will be centred at the Stanford Linear Accelerator Center. The launch is scheduled for 2005.
Expected to have a mission life of five years, GLAST will make great improvements over previous gamma-ray telescopes such as EGRET, currently aboard the Compton Observatory. EGRET has operated successfully long past its design life and is about to be retired. Compared with EGRET, GLAST will have a field of view and an effective area each about six times as large, a sensitivity some 50 times better and an energy reach more than 10 times higher. Its wide field of view will enable scientists to probe extreme transient phenomena, such as active galactic nuclei and the mysterious gamma-ray bursts, over a range of timescales.
The primary GLAST instrument is a matrix of towers composed of thin lead foil interleaved with thin silicon detectors to record the gamma-ray direction, followed by a matrix of scintillation crystals to measure the gamma-ray energy. Using about 100 sq. m of silicon strip detectors, GLAST will be by far the largest silicon-based detector to be launched into space.