Robust and adaptive methods for track/vertex reconstruction
Disciplines
Computer Sciences (25%); Mathematics (50%); Physics, Astronomy (25%)
Keywords
CLUSTERING,
ROBUST ESTIMATION,
ADAPTIVE ESTIMATION,
CLASSIFICATION
The efficient geometrical reconstruction of tracks and interaction vertices is a crucial step in the analysis of experimental data in high-energy physics. The algorithms employed in this step have to be highly reliable in order not to bias the subsequent physics analysis, but sufficiently fast to allow the processing of a huge number of interactions in nearly real time. In the next generation of experiments at the LHC (Large Hadron Collider at CERN, Geneva), track and vertex reconstruction will be particularly difficult because of the high interaction rates, the large multiplicity of tracks and interaction vertices, and the high background against which the relevant particle tracks, the primary interaction vertex and possible secondary vertices have to be found and estimated. Traditionally, the problem has been decomposed into a pattern recognition part and an estimation part. The aim of this project is to develop and implement adaptive methods which solve both problems in parallel, the estimation being carried out concurrently with the assignment of hits to tracks or of tracks to vertices. Adaptive methods are expected to be more robust to noise and also faster than conventional methods. They are, however, in general iterative and therefore sensitive to the starting values of the estimates and the initial assumptions about the assignment. Another aim of the project is therefore the exploration of robust estimators which give sufficiently good starting values in spite of high background. In addition, the reconstruction of very short-lived, invisible particles requires reliable clustering algorithms which are able to classify reconstructed tracks according to their production vertices, in order to provide the adaptive estimator with good initial assignments. The development of suitable clustering methods therefore constitutes an essential part of the project.
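The idea of carrying out the estimation concurrently with the track-to-vertex assignment can be illustrated with a deliberately simplified sketch. The following one-dimensional toy example is an assumption modeled on deterministic-annealing reweighting, not the project's actual implementation; the Fermi-type weight function, the cut-off `chi2_cut`, and the annealing schedule `temps` are all illustrative choices.

```python
import numpy as np

def adaptive_vertex_fit(z, sigma, chi2_cut=9.0, temps=(64.0, 16.0, 4.0, 1.0), n_iter=20):
    """Toy 1-D adaptive vertex fit: estimate a vertex position from track
    intercepts z with uncertainties sigma, while concurrently down-weighting
    incompatible (background) tracks via annealed soft-assignment weights."""
    z = np.asarray(z, dtype=float)
    sigma = np.asarray(sigma, dtype=float)
    est = np.median(z)                       # robust starting value
    w = np.ones_like(z)
    for T in temps:                          # annealing: weights harden as T -> 1
        for _ in range(n_iter):
            chi2 = ((z - est) / sigma) ** 2
            arg = np.clip((chi2 - chi2_cut) / (2.0 * T), -50.0, 50.0)
            w = 1.0 / (1.0 + np.exp(arg))    # soft assignment weight in [0, 1]
            # weighted least-squares update using the current assignments
            est = np.sum(w * z / sigma**2) / np.sum(w / sigma**2)
    return est, w
```

Tracks compatible with the vertex end up with weights near 1 and background tracks near 0, so on clean data the result approaches that of an ordinary weighted least-squares fit, while outliers are suppressed automatically.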
It is to be expected that the optimal choice of method depends on the physical process being studied; finding the right method under any given circumstances is therefore important. This difficult problem will be tackled, but very likely not solved entirely within the project. The results of the project are intended for use in the CMS experiment, which will go into operation at the LHC in 2006. The algorithms will therefore be implemented in the object-oriented CMS reconstruction software.
The experiments at the new LHC accelerator at CERN will come into operation in 2007. They will be able to answer questions that have defied the experimenters so far. The data analysis software will be an important part of the experiment. In every collision of the proton beams in the LHC a large number of particles is produced. Some of them are stable, some of them decay very quickly. The particles produced by the collision or by subsequent decays are measured in the detector. The software has to reconstruct the particle tracks from the detector signals and to find the vertices (collision or decay points) where the particles have been produced. To accomplish this we need pattern recognition methods and statistical estimators. In this project we have developed methods that are able to cope with the difficult conditions in the LHC experiments. Because of the extremely high data rate and the large background noise we need algorithms which are at the same time fast, insensitive to the background (robust), and nevertheless precise (statistically efficient). We have found methods for track and vertex estimation that are nearly optimal in the absence of background, but are able to suppress any background noise efficiently and automatically. We call these estimators adaptive, because they adapt autonomously to the experimental conditions. All of them are iterative, and their performance may vary strongly with their starting point. We have identified a robust estimator that finds a suitable starting point in all relevant cases. In the project we have also studied and refined various methods of pattern recognition in order to optimise the search for the decay vertices of short-lived particles. We have shown that the adaptive estimator can also be applied to vertex finding and that it is much faster than traditional methods in this context. It is therefore eminently suitable for deployment in real-time event filtering.
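The report does not name the robust starting-point estimator that was identified. One well-known location estimator with the required high breakdown point, given here purely as an illustrative sketch (not necessarily the estimator used in the project), is the half-sample mode, which repeatedly shrinks the sample to the shortest interval containing half of the points:

```python
import numpy as np

def half_sample_mode(x):
    """Half-sample mode: robust location estimate (breakdown point ~50%).
    Repeatedly restrict the sample to the shortest interval that contains
    half of the remaining points, then average the final pair."""
    x = np.sort(np.asarray(x, dtype=float))
    while len(x) > 2:
        h = (len(x) + 1) // 2                    # half-sample size
        widths = x[h - 1:] - x[:len(x) - h + 1]  # widths of all half-sample intervals
        i = np.argmin(widths)                    # shortest such interval
        x = x[i:i + h]
    return x.mean()
```

Because each step keeps only the densest half of the data, even a large fraction of background points far from the true vertex cannot pull the starting value away from the main cluster.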
All methods developed in the course of the project have been implemented in the reconstruction software framework of the CMS experiment, and they have been validated and evaluated in the context of LHC physics. A number of software tools have been developed to aid in the validation and evaluation process. Among them are a visualisation of tracks and vertices, a fast simulation tool, and a tool for automatic optimisation of algorithms. Some other experiments have already shown interest in the methods developed in our project. It is envisaged to make the most important ones available in a stand-alone software toolkit. As the current implementation is based entirely on object-oriented principles, this should be possible without undue effort.
Research Output
- 1 Publication
2003
Title: Properties of robust vertex fitting algorithms at high luminosity HEP experiments
Author: D'Hondt J
Type: Conference Proceeding
DOI: 10.1109/nssmic.2003.1351832
Pages: 858-862