Data reconciliation for process flow
Abstract
Before improving the control of a plant, one must ensure the coherence of the information produced by the instrumentation and sensors. This information can be corrupted by errors and can also deviate from the optimal operating range; the operator must take precautions to remain within that range. Error detection is used to point out such deviations. Detection, location, characterization of the different errors and estimation of the true values are the steps of the data reconciliation problem. Process measurements are subject to two types of errors: random errors, generally taken to be independent and Gaussian with zero mean, and gross errors, which are caused by non-random events such as malfunctioning sensors, instrument biases and inexact process models. Various methods for the detection and location of gross errors in process data have been proposed in recent years, including the parity space approach, the standardized least-squares residuals approach and the standardized imbalance residuals approach. Process data reconciliation and its relationship to process monitoring have been the subject of many publications; for recent ones, see for example Mah (1982), Gertler (1988) or Ragot (1990). In this survey, the authors summarize the various aspects of data reconciliation, point out the main difficulties and present the state of the art in this field. The steps of the data reconciliation problem are presented in the following order: techniques of data reconciliation, classification of the data by the observability concept, gross error detection and localization, and estimation of the measurement error variance.
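The reconciliation and gross-error-detection steps described above can be illustrated with a minimal sketch. The following example (not from the paper; the flowsheet, measured values and variances are hypothetical) reconciles flow measurements around a single splitter node subject to a linear mass balance, using the standard weighted least-squares projection, and forms a standardized balance residual of the kind used in residual-based gross error tests:

```python
import numpy as np

# Hypothetical 3-stream splitter: stream 1 splits into streams 2 and 3,
# so the mass balance is x1 - x2 - x3 = 0.
A = np.array([[1.0, -1.0, -1.0]])   # balance (incidence) matrix
y = np.array([10.2, 4.1, 5.9])      # measured flows (assumed values)
V = np.diag([0.04, 0.02, 0.02])     # measurement error covariance (assumed)

# Balance residual on the raw measurements (zero if they were coherent).
r = A @ y
# Covariance of the balance residual under the random-error model.
W = A @ V @ A.T
# Weighted least-squares reconciliation: project the measurements onto
# the constraint A x = 0, weighting by the error covariance.
x_hat = y - V @ A.T @ np.linalg.solve(W, r)

# Standardized balance residual: under the zero-mean Gaussian assumption
# it is N(0, 1), so values well beyond ~1.96 flag a gross error at the
# 5% level.
s = r / np.sqrt(np.diag(W))
```

Here the reconciled flows `x_hat` satisfy the balance exactly, and the correction applied to each stream is proportional to its measurement variance, so the least reliable instruments absorb most of the imbalance.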
Origin: Files produced by the author(s)