Diagnosis and Control M - module 2 (30 hours)

(Master in Automation Engineering)

Contents
This part of the course is mainly focused on model-based fault detection and isolation techniques.
A brief overview of signal-based methods: limit checking, trend checking, frequency-domain methods. Statistical hypothesis testing. The Neyman-Pearson lemma. Testing between two simple hypotheses. Testing a simple hypothesis against a composite one. Testing between two composite hypotheses: the generalized likelihood ratio test. The sequential probability ratio test. The Wald test.
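
As an illustration of the sequential testing topics above, the following is a minimal Python sketch of Wald's sequential probability ratio test for two simple hypotheses on the mean of Gaussian data; the Gaussian change-in-mean setting, the numerical values, and the function name are illustrative assumptions, not course material.

```python
# Minimal sketch of Wald's sequential probability ratio test (SPRT) for two
# simple hypotheses on the mean of i.i.d. Gaussian data with known variance.
# All numerical values below are arbitrary illustrative assumptions.
import numpy as np

def sprt_gaussian_mean(x, mu0, mu1, sigma, alpha=0.01, beta=0.01):
    """Run the SPRT on samples x for H0: mean = mu0 vs H1: mean = mu1.

    alpha and beta are the target error probabilities; returns the decision
    ('H0', 'H1' or 'undecided') and the number of samples used.
    """
    A = np.log((1 - beta) / alpha)   # upper threshold (accept H1)
    B = np.log(beta / (1 - alpha))   # lower threshold (accept H0)
    llr = 0.0                        # cumulative log-likelihood ratio
    for k, xk in enumerate(x, start=1):
        # log of N(xk; mu1, sigma^2) / N(xk; mu0, sigma^2)
        llr += (mu1 - mu0) / sigma**2 * (xk - (mu0 + mu1) / 2.0)
        if llr >= A:
            return "H1", k
        if llr <= B:
            return "H0", k
    return "undecided", len(x)

rng = np.random.default_rng(0)
data = rng.normal(loc=0.5, scale=1.0, size=200)   # data actually from H1
print(sprt_gaussian_mean(data, mu0=0.0, mu1=0.5, sigma=1.0))
```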
Parameter estimation-based approaches: the least squares method. The least squares estimate and its geometric interpretation. Examples. Least squares identification of AR and ARX models. Residual signals based on the parameter estimates. Model validation and model order selection: statistical tests on the residuals. Grey-box identification: the relation between estimated and physical parameters and its use in fault diagnosis. Recursive least squares algorithms. Weighted least squares.
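
A minimal sketch of least squares identification of an ARX model and of the one-step-ahead prediction residuals built from the estimated parameters; the first-order model, its coefficients, and the noise level are illustrative assumptions.

```python
# Minimal sketch: least squares identification of a first-order ARX model
#   y(k) = 0.8*y(k-1) + 0.5*u(k-1) + e(k)
# (model order and coefficients assumed) and the resulting residuals.
import numpy as np

rng = np.random.default_rng(1)
N = 500
u = rng.standard_normal(N)                  # input sequence
e = 0.05 * rng.standard_normal(N)           # measurement noise
y = np.zeros(N)
for k in range(1, N):
    y[k] = 0.8 * y[k - 1] + 0.5 * u[k - 1] + e[k]

# Regressor matrix Phi and target vector Y for k = 1, ..., N-1
Phi = np.column_stack([y[:-1], u[:-1]])
Y = y[1:]

# Least squares estimate: theta_hat minimizes ||Y - Phi * theta||^2
theta_hat, *_ = np.linalg.lstsq(Phi, Y, rcond=None)
print("estimated parameters:", theta_hat)   # close to [0.8, 0.5]

# One-step-ahead prediction residuals: small and white if the model is valid
residuals = Y - Phi @ theta_hat
print("residual standard deviation:", residuals.std())
```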
Diagnostic observers. The Luenberger observer. Reduced-order Luenberger observers and their use in instrument fault detection and isolation (IFD). Residual generation and evaluation for IFD.
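
A minimal sketch of a full-order Luenberger observer used as a residual generator for sensor (instrument) fault detection; the plant matrices, the observer gain, and the fault scenario are illustrative assumptions.

```python
# Minimal sketch of a Luenberger observer as a residual generator for
# instrument fault detection; plant, gain L and fault are all assumptions.
import numpy as np

# Discrete-time plant x(k+1) = A x(k) + B u(k),  y(k) = C x(k)
A = np.array([[0.9, 0.1],
              [0.0, 0.8]])
B = np.array([[0.0],
              [1.0]])
C = np.array([[1.0, 0.0]])
L = np.array([[0.5],
              [0.2]])           # observer gain (chosen so A - L C is stable)

N = 60
x = np.zeros((2, 1))
xhat = np.zeros((2, 1))
for k in range(N):
    u = np.array([[1.0]])
    y = C @ x
    if k >= 30:
        y = y + 0.5             # additive sensor fault from k = 30 on
    r = y - C @ xhat            # residual: zero until the fault appears
    # plant and observer updates
    x = A @ x + B @ u
    xhat = A @ xhat + B @ u + L @ r
    if k % 10 == 0:
        print(f"k={k:2d}  residual = {r.item():+.4f}")
```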
The parity equations approach. Primary and secondary (enhanced) residual sets. Fault isolation by means of enhanced residuals. Weak isolation and strong isolation. Disturbance decoupling. Parity equations from state-space models.
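
A minimal sketch of parity equations derived from a state-space model: the outputs over a short window are stacked and combined with a vector orthogonal to the extended observability matrix, giving a residual that is insensitive to the unknown state but reacts to a sensor fault. The system, the window length, and the fault scenario are illustrative assumptions.

```python
# Minimal sketch of a parity-space residual built from a state-space model.
import numpy as np

A = np.array([[0.9, 0.1],
              [0.0, 0.8]])
B = np.array([[0.0],
              [1.0]])
C = np.array([[1.0, 0.0]])
s = 2                                        # parity window length (assumed)

# Extended observability matrix O = [C; CA; ...; CA^s] and Toeplitz matrix T
# of Markov parameters, so that Y_s(k) = O x(k-s) + T U_s(k) when fault-free
O = np.vstack([C @ np.linalg.matrix_power(A, i) for i in range(s + 1)])
T = np.zeros((s + 1, s))
for i in range(1, s + 1):
    for j in range(i):
        T[i, j] = (C @ np.linalg.matrix_power(A, i - 1 - j) @ B).item()

# Parity vector v with v^T O = 0 (left null space of O, via the SVD)
U_, _, _ = np.linalg.svd(O)
v = U_[:, -1]

# Simulate the plant and evaluate the parity relation r(k) = v^T (Y_s - T U_s);
# the residual is at machine precision until the sensor fault appears.
N = 40
x = np.zeros((2, 1))
y_hist, u_hist = [], []
for k in range(N):
    u = 1.0
    y = (C @ x).item() + (0.3 if k >= 20 else 0.0)   # additive sensor fault
    y_hist.append(y)
    u_hist.append(u)
    x = A @ x + B * u
    if k >= s and k % 5 == 0:
        Ys = np.array(y_hist[k - s:k + 1])           # y(k-s), ..., y(k)
        Us = np.array(u_hist[k - s:k])               # u(k-s), ..., u(k-1)
        r = v @ (Ys - T @ Us)
        print(f"k={k:2d}  parity residual = {r:+.3e}")
```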
Optimal state estimation in a stochastic framework: the Kalman filter. Derivation of the Kalman filter equations. Properties of the Kalman filter. The Kalman predictor. The difference Riccati equation and its convergence properties. The innovation and its properties. Some extensions: nonzero-mean white noise, colored noise, time-varying models. Fault detection based on the innovation of the Kalman predictor. Isolation of output sensor faults: banks of Kalman filters. Fault isolation based on the Wald test: detection of changes in the mean and covariance of the process and output noise.
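
A minimal sketch of fault detection based on the innovation of the Kalman predictor for an assumed scalar system: under the no-fault hypothesis the normalized innovation is white with zero mean and unit variance, so a simple windowed mean test (a stand-in for the sequential tests discussed in the course) reacts to an output sensor bias. All numerical values are illustrative assumptions.

```python
# Minimal sketch: Kalman predictor, its innovation, and a simple windowed
# test on the normalized innovation; system and fault are assumptions.
import numpy as np

a, c = 0.5, 1.0          # x(k+1) = a x(k) + w(k),   y(k) = c x(k) + v(k)
q, r = 0.1, 0.2          # process / measurement noise variances
rng = np.random.default_rng(3)

N, W = 120, 20           # simulation length and detection window
x, xhat, P = 0.0, 0.0, 1.0
nu_hist = []
alarmed = False

for k in range(N):
    y = c * x + rng.normal(scale=np.sqrt(r))
    if k >= 60:
        y += 1.0                          # additive output sensor fault

    # innovation, its variance, and the normalized innovation
    e = y - c * xhat
    S = c * P * c + r
    nu_hist.append(e / np.sqrt(S))

    # detection: the windowed mean of the normalized innovation is N(0, 1/W)
    # in the fault-free case, so |mean| > 3/sqrt(W) raises an alarm
    if k >= W and not alarmed:
        g = np.mean(nu_hist[-W:])
        if abs(g) > 3.0 / np.sqrt(W):
            print(f"k={k:3d}: fault alarm, windowed innovation mean = {g:+.2f}")
            alarmed = True

    # one-step-ahead Kalman predictor update (difference Riccati equation)
    K = a * P * c / S
    xhat = a * xhat + K * e
    P = a * P * a + q - K * S * K

    # true system simulation
    x = a * x + rng.normal(scale=np.sqrt(q))
```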