Johnston and Kramer (1995) reported the feasibility and improved performance of robust estimators as the objective function in data reconciliation. This paper describes a data reconciliation and gross error detection software, RAGE, that converts raw data into useful plant information for performance analysis. The current paper presents a method to identify and estimate gross errors for linear dynamic systems using polynomial approximation.
If gross errors are also present in the process data, they must be identified and removed by discarding the corresponding measurements before reconciliation. In any modern chemical plant, petrochemical process, or refinery, hundreds or even thousands of variables such as flow rates, temperatures, pressures, levels, and compositions are measured; process data conditioning methods such as data reconciliation and gross error diagnosis turn these raw measurements into consistent information. Software implementing data reconciliation, like other software, must have a usable GUI for model development and end users, and effective data integration to retrieve the sensor data. Variables that are not measured directly can be evaluated indirectly by combining several measurements. Without the availability of constraints as a countercheck on the measurements, gross error detection cannot be carried out. Two different approaches have been considered to deal with the problem of gross errors in data reconciliation, in addition to a new mixed one. Data filtering, data compression, and the impact of measurement selection on data reconciliation are also exhaustively explained. Several new algorithms for the detection of gross errors in process data are presented and applied to an industrial steam system. In its most general form, data reconciliation is a minimization of measurement errors subject to satisfying the constraints of the process model.
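That minimization can be made concrete. For linear balance constraints A x = 0 and a known diagonal measurement-error covariance, the weighted least squares reconciliation has a closed form. A minimal sketch follows; the three-stream splitter, its flow values, and the sensor variances are all hypothetical, chosen only for illustration:

```python
import numpy as np

# Hypothetical splitter: stream 1 divides into streams 2 and 3,
# so the steady-state mass balance is x1 - x2 - x3 = 0, i.e. A @ x = 0.
A = np.array([[1.0, -1.0, -1.0]])

# Raw measurements y and their (assumed) diagonal error covariance.
y = np.array([101.9, 68.4, 32.2])
Sigma = np.diag([1.0**2, 0.8**2, 0.5**2])

# Closed-form constrained weighted least squares:
#   minimize (x - y)' Sigma^-1 (x - y)  subject to  A x = 0
# gives   x_hat = y - Sigma A' (A Sigma A')^-1 A y.
r = A @ y                                        # balance residual of raw data
gain = Sigma @ A.T @ np.linalg.inv(A @ Sigma @ A.T)
x_hat = y - gain @ r

print(np.round(x_hat, 3))     # reconciled flows
print(float(A @ x_hat))       # balance now closes (numerically zero)
```

Note how the adjustment to each measurement is proportional to its variance: the least-trusted sensor absorbs the largest share of the imbalance.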
Individual measurement validation and steady-state detection are not treated in this thesis. With information technology applied widely in the process industries, a large amount of historical data has become available that can be used to obtain prior probability distributions. Data reconciliation and gross error detection play an important role in enhanced process operation analysis, since measured signals of process variables are often contaminated by measurement errors that include both random and gross errors. Finally, a bias error detection metric was proposed, tested, and analyzed using the chi-square test. For this reason, only one value per instrument is considered, and practitioners are forced to take daily averages of the gathered data.
Data reconciliation deals with the problem of random errors; detection of gross errors is a separate concern. From a statistical point of view, the main assumption is that no systematic errors exist in the set of measurements, since they may bias the reconciliation results and reduce its robustness. Early work in dynamic data reconciliation is rooted in the problem of process state estimation using the concept of filtering. Data reconciliation is useful not only in process development but also in model improvement. Reconciliation of measurements is usually based on weighted least squares estimation under constraints, and detection of gross errors is based on the residuals obtained in the reconciliation step. This book provides a systematic and comprehensive treatment of the variety of methods available for applying data reconciliation techniques. This paper presents a method to identify and estimate gross errors in plant linear dynamic data reconciliation.
Parameter estimation is the step after data reconciliation, in which the reconciled values of the process variables are used. The main assumption in all commercial data reconciliation packages is that measurement values correspond to a steady state; otherwise, data are deemed unreliable for process optimization. The method was based on data reconciliation of the process flow rates and concentrations and was applied to the juice concentration sector. Data reconciliation and gross error detection are two common tools used in industry to provide accurate and reliable data.
Both developed tools were used to analyse the data set collected from the refinery. In this doctoral research, mass-balance-based data reconciliation is applied to get more out of the data collected at wastewater treatment plants, particularly in terms of data accuracy. Lately, the problem has been addressed using model-based data smoothing. Keywords: measurement, redundancy, uncertainty, validation, error detection.
This is generally done prior to the final data reconciliation run for a given data set, and is often included in the overall data reconciliation process. Measurements made on stream flows in a chemical process network are expected to satisfy mass and energy balance equations at steady state. A number of algorithms that have appeared in the literature are also applied to the steam system. In data management, by contrast, data reconciliation refers to the process of matching records across different databases.
Data reconciliation is a technique that aims to correct measurement errors due to measurement noise, i.e., random errors. All of these approaches will be summarized briefly in this section. Data provided by measurement are affected by errors arising from the uncertainty and imprecision of measuring instruments. Principal component analysis has also been used to detect gross errors in data reconciliation (AIChE Journal, 41(7)). These methods are applied to a small stream-mixing process as an example. In this paper we shall consider the gross errors to be associated with the measurements rather than the constraints. The time of occurrence of a gross error can be detected by a global test, a sequential probability ratio test, or the generalized likelihood ratio test. A data reconciliation problem begins with data acquisition. The efficacy of data reconciliation and gross error detection depends significantly upon the location of measured variables.
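The global test mentioned above admits a short numerical sketch: under the null hypothesis of random errors only, the quadratic form of the constraint residual follows a chi-square distribution with as many degrees of freedom as independent constraints. The splitter network, the variances, and the simulated +5 bias below are hypothetical:

```python
import numpy as np

# Hypothetical splitter balance A @ x = 0 and assumed sensor variances.
A = np.array([[1.0, -1.0, -1.0]])
Sigma = np.diag([1.0, 0.64, 0.25])

# Measurements with a simulated gross error (stream 2 reads ~5 units high).
y = np.array([100.0, 72.0, 33.0])

# Global test statistic: gamma = r' (A Sigma A')^-1 r ~ chi2(m) under H0,
# where r is the balance residual and m the number of constraints.
r = A @ y
V = A @ Sigma @ A.T
gamma = float(r @ np.linalg.inv(V) @ r)

CHI2_95_1DOF = 3.841  # 95% critical value, 1 degree of freedom
print(f"gamma = {gamma:.2f}, gross error flagged: {gamma > CHI2_95_1DOF}")
```

With a single balance equation the test can flag the data set as a whole but cannot by itself isolate which sensor is at fault; locating the faulty measurement requires additional redundancy or one of the serial elimination strategies discussed in the literature.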
This second tool is only able to solve data reconciliation problems, so it is targeted at data sets containing exclusively random errors. Chapters 4, 5, and 6 are case studies using the modeling framework. Because of the presence of random and possibly gross errors, raw measurements do not, in general, satisfy the process constraints. Furthermore, these approaches detect partial gross errors, so it is not required to assume that gross errors exist in the entire data set. Even if data reconciliation is carried out, the adjusted data are ruined when the original measurements include systematic gross errors caused by nonrandom events such as instrument miscalibration.
Data reconciliation provides more precise estimates of the process variables, which gives us the opportunity to optimize better and achieve higher economic gains. Gross errors are only defined with respect to an assumed measurement model. Romagnoli and Sanchez, Data Processing and Reconciliation for Chemical Process Operation, Academic Press, ISBN 0125944608, 2000. Steady-state data reconciliation and gross error detection have begun to be applied widely in industry. Comparing the results of data reconciliation with those of gross error detection when one gross error was included in six measurements, the error reduction achieved by GEDSL was 69%. A mass balance was then applied to the subsequent sectors (crystallization, centrifugation, and drying) to calculate the sugar production and compare it with the real production. For repeated samples, the breakdown point is the fraction of gross errors in the data that an estimator can tolerate while still giving a meaningful value. Sensor measurements in a process network inherently contain random and/or gross errors.
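The breakdown point is easy to demonstrate with the two simplest location estimators: the sample mean breaks down with a single gross error (breakdown point 0), while the median tolerates contamination of up to half the sample. The flow readings below are made up for illustration:

```python
import numpy as np

# Five repeated readings of a flow near 10, then the same set with
# one grossly faulty reading substituted in.
clean = np.array([10.1, 9.9, 10.0, 10.2, 9.8])
dirty = clean.copy()
dirty[0] = 1000.0          # a single gross error

print(np.mean(clean), np.median(clean))   # both near 10
print(np.mean(dirty), np.median(dirty))   # mean explodes, median stays at 10
```

This is the motivation for the robust objective functions mentioned above: an estimator with a nonzero breakdown point keeps the reconciled values meaningful even when some gross errors slip past detection.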
Data reconciliation in the database sense requires joining on fields that have traditionally been non-key fields. Data filtering and reconciliation have been used for several years as a means of obtaining accurate and consistent data in process plants. Finally, the performance of the proposed methods is verified through various simulation studies. Data errors due to miscalibrated or faulty sensors, or simply random events nonrepresentative of the underlying statistical distribution, can induce heavy biases in parameter estimates and reconciled data. Robust estimators and exploratory statistical methods for the detection of gross errors during data reconciliation are discussed. In addition to random measurement errors, the raw process data may also contain gross errors, which are caused by nonrandom events. Processing of online data for use with steady-state models requires identifying the existence of a steady state, detecting the presence of gross errors, if any, and reconciling the data to eliminate random measurement noise. Numerical and performance comparisons were made among nine optimization methods.
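One common way to use a robust estimator in reconciliation is to replace the squared residual with a Huber loss and solve by iteratively reweighted least squares (IRLS): large standardized adjustments are down-weighted, so a biased sensor pulls the solution less. This is a sketch under assumptions, not any specific published algorithm; the splitter balance, readings, and tuning constant are hypothetical:

```python
import numpy as np

A = np.array([[1.0, -1.0, -1.0]])      # splitter balance A @ x = 0
y = np.array([100.0, 72.0, 33.0])      # stream 2 carries a suspected bias
sigma = np.array([1.0, 0.8, 0.5])      # assumed sensor standard deviations
k = 1.345                              # common Huber tuning constant

x = y.copy()
for _ in range(50):
    e = (x - y) / sigma                # standardized adjustments
    # Huber weights: 1 inside the threshold, k/|e| beyond it.
    w = np.where(np.abs(e) <= k, 1.0, k / np.maximum(np.abs(e), 1e-12))
    Sw = np.diag(sigma**2 / w)         # inflate variance of suspect sensors
    # Constrained WLS step with the reweighted covariance.
    x = y - Sw @ A.T @ np.linalg.inv(A @ Sw @ A.T) @ (A @ y)

print(np.round(x, 3), float(A @ x))    # balance closes at every iterate
```

Each IRLS step satisfies the balance exactly by construction; the weights only redistribute how the imbalance is allocated among the sensors.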