Recursive Estimation and Time-Series Analysis: An Introduction

Bibliographic Details
Main Author: Young, Peter C.
Format: eBook
Language: English
Published: Berlin, Heidelberg: Springer Berlin Heidelberg, 1984
Edition: 1st ed. 1984
Series: Communications and Control Engineering
Subjects: Control, Robotics, Automation; Control engineering; Robotics; Automation
Online Access: https://doi.org/10.1007/978-3-642-82336-7
Collection: Springer Book Archives -2004 - Collection details see MPG.ReNa
LEADER 09150nmm a2200373 u 4500
001 EB000676035
003 EBX01000000000000000529117
005 00000000000000.0
007 cr|||||||||||||||||||||
008 140122 ||| eng
020 |a 9783642823367 
100 1 |a Young, Peter C. 
245 0 0 |a Recursive Estimation and Time-Series Analysis  |h Electronic resource  |b An Introduction  |c by Peter C. Young 
250 |a 1st ed. 1984 
260 |a Berlin, Heidelberg  |b Springer Berlin Heidelberg  |c 1984, 1984 
300 |a XVI, 300 p  |b online resource 
505 0 |a 1. Introduction -- 2. Recursive Estimation: A Tutorial Introduction -- 2.1 Recursive Estimation of the Mean Value of a Random Variable -- 2.2 Recursive Least Squares Estimation for a Single Unknown Parameter in a Regression Relationship -- 2.3 Summary -- 3. Recursive Estimation and Stochastic Approximation -- 3.1 The Recursive Least Squares Algorithm -- 3.2 Connection with Stochastic Approximation -- 3.3 Some Extensions to Stochastic Approximation -- · Matrix gain SA and optimum algorithms -- · Continuous-time algorithms -- · Search algorithms -- · Acceleration of convergence -- 3.4 Summary -- 4. Recursive Least Squares Regression Analysis -- 4.1 The RLS Algorithm for the General Linear Regression Model -- 4.2 Some Cautionary Comments: Multiple Collinearity and Errors-in-variables -- · Multiple collinearity -- · Errors-in-variables and the structural model -- 4.3 Summary -- 5. Recursive Estimation of Time-Variable Parameters in Regression Models --  
505 0 |a 5.1 Shaping the Memory of the Estimator -- · The moving rectangular window -- · The moving exponential window (exponential forgetting factor) -- 5.2 Modelling the Parameter Variations -- 5.3 Vector Measurements and the Kalman Filter-Estimation Algorithm -- · Estimation of the varying mean value for Walgett rainfall data -- 5.4 The Estimation of Rapidly Varying Parameters -- · Simple parameter decomposition -- · A practical example: Missile parameter estimation -- · More complex methods -- 5.5 Simplified ‘Constant Gain’ Algorithms for Time Variable Parameter Estimation -- 5.6 Recursive Smoothing Algorithms -- 5.7 Statistical Methods for Detecting the Presence of Parameter Variation -- 5.8 Summary -- 6. The Time-Series Estimation Problem -- 6.1 The Time-Series Model in the Observation Space -- 6.2 Various Observation Space Model Forms -- 6.3 Least Squares Estimation: Its Advantages and Limitations -- 6.4 Least Squares Estimation: A Special Case --  
505 0 |a 6.5 The General Case: The Structural Model -- 6.6 Summary -- 7. The Instrumental Variable (IV) Method of Time-Series Analysis -- 7.1 The Recursive IV Algorithm for Time-Series Models -- 7.2 The Recursive-Iterative IV Algorithm -- 7.3 Estimation of the Noise Model Parameters: the AML Method -- 7.4 Statistical Properties of the IV and AML Estimates -- · IV estimates -- · AML estimates -- 7.5 Convergence of the IV and AML Estimators -- 7.6 Identifiability and the Choice of Input Signals -- · A special case -- · Choice of input signals -- · Restrictions on the system to be identified -- · The general case -- · Noise process identifiability -- · Some concluding comments on identifiability -- 7.7 Parametric Variations -- 7.8 A Time-Series Analysis Procedure Based on IV-AML Estimation -- 7.9 Representative Practical Results -- · The gas furnace data of Box-Jenkins -- · Rainfall-runoff modeling for the Bedford-Ouse River Basin -- 7.10 Summary --  
505 0 |a 8. Optimum Instrumental Variable Methods of Time-Series Model Estimation -- 8.1 The Maximum-Likelihood Method -- 8.2 IV Within the Context of Maximum Likelihood -- 8.3 The AML Method Within the Context of Maximum Likelihood -- 8.4 A Refined Recursive IV-AML Approach to Time-Series Analysis -- 8.5 The Statistical Properties of the Refined IVAML Estimates -- 8.6 Performance of the Refined IVAML Algorithms -- 8.7 A Practical Example: Analysis of Time-Series Tracer Data in Translocation Studies -- 8.8 The Optimum Generalized Equation Error (OGEE) Approach to Time-Series Analysis -- · For various common time-series model forms -- · For a multiple input, single output (MISO) system model -- 8.9 Summary -- 9. Alternative Recursive Approaches to Time-Series Analysis -- 9.1 Prediction Error (PE) Methods -- · Statistical Properties of PE Estimates for the Transfer Function (TF) Model -- 9.2 The Extended Kalman Filter --  
505 0 |a · A practical example: stochastic-dynamic model for water quality in the Bedford-Ouse River -- 9.3 Maximum Likelihood Estimation in the State Space -- 9.4 Summary -- 10. Recursive Estimation: A General Tool in Data Analysis and Stochastic Model Building -- 10.1 Pre-processing of Time-Series Data -- 10.2 Model Structure Identification -- 10.3 Model Parameter Estimation -- 10.4 State Estimation -- 10.5 Self-Adaptive (or Self-Tuning) Estimation, Forecasting and Control -- 10.6 Summary -- 11. Epilogue -- Appendix 1 Relevant Mathematical and Statistical Background Material -- A.1.1 Matrix Algebra -- 1. Matrices -- 2. Vectors -- 3. Matrix Addition (or Subtraction) -- 4. Matrix or Vector Transpose -- 5. Matrix Multiplication -- 6. Determinant of a Matrix -- 7. Partitioned Matrices -- 8. Inverse of a Matrix -- 9. Quadratic Forms -- 10. Positive Definite or Semi-Definite Matrices -- 11. The Rank of a Matrix -- 12. Differentiation of Vectors and Matrices -- A.1.2 Statistics and Probability --  
505 0 |a 1. Discrete Random Variables -- · Mean value -- · Variance -- 2. Discrete Random Vectors -- · Joint probability mass function -- · Marginal probability mass function -- · Mean -- · Covariance Matrix -- 3. Conditional Probabilities -- · Conditional probability mass function -- 4. Continuous Random Variables and Vectors -- · Probability Density Function -- 5. The Normal or Gaussian Density Function -- · Normally distributed random variable (scalar) -- · Normally distributed random vector -- 6. Properties of Estimators -- · Unbiased -- · Asymptotically unbiased -- · Minimum variance, unbiased estimators -- · Probability-in-the-limit (p.lim) -- 7. The Likelihood Function and Maximum Likelihood Estimation -- · Hessian matrix -- 8. The Cramer-Rao Lower Bound -- · Information matrix -- 9. Time-Series -- · Mean and variance -- · Covariance and correlation -- · Covariance matrix -- · White noise -- 10. Gauss-Markov Random Sequences --  
505 0 |a · Conditional probability density function -- · Joint probability density function -- A.1.3 Simple Deterministic Dynamic Systems -- 1. First Order, Continuous-Time Linear Dynamic System -- · Time-constant -- · Steady-state gain -- 2. First Order Discrete-Time Linear Dynamic System -- 3. The Discrete-Time State Space Representation of a Deterministic Dynamic System -- 4. Transfer Function Representation of a Single Input, Single Output (SISO) Discrete Dynamic System -- 5. The Infinite Dimensional, Impulse Response Representation of a Linear SISO Discrete Dynamic System -- 6. Differentiation of a TF with respect to a Given Parameter -- Appendix 2 Gauss’s Derivation of Recursive Least Squares -- Appendix 3 The Instantaneous Cost Function Associated With the Recursive Least Squares Algorithm -- References -- Author Index 
653 |a Control, Robotics, Automation 
653 |a Control engineering 
653 |a Robotics 
653 |a Automation 
041 0 7 |a eng  |2 ISO 639-2 
989 |b SBA  |a Springer Book Archives -2004 
490 0 |a Communications and Control Engineering 
028 5 0 |a 10.1007/978-3-642-82336-7 
856 4 0 |u https://doi.org/10.1007/978-3-642-82336-7?nosfx=y  |x Publisher  |3 Full text 
082 0 |a 629.8 
520 |a This book has grown out of a set of lecture notes prepared originally for a NATO Summer School on "The Theory and Practice of Systems Modelling and Identification" held between the 17th and 28th July, 1972 at the École Nationale Supérieure de l'Aéronautique et de l'Espace. Since this time I have given similar lecture courses in the Control Division of the Engineering Department, University of Cambridge; Department of Mechanical Engineering, University of Western Australia; the University of Ghent, Belgium (during the time I held the IBM Visiting Chair in Simulation for the month of January, 1980); the Australian National University; and the Agricultural University, Wageningen, the Netherlands. As a result, I am grateful to all the recipients of these lecture courses for their help in refining the book to its present form; it is still far from perfect, but I hope that it will help the student to become acquainted with the interesting and practically useful concept of recursive estimation. Furthermore, I hope it will stimulate the reader to further study the theoretical aspects of the subject, which are not dealt with in detail in the present text. The book is primarily intended to provide an introductory set of lecture notes on the subject of recursive estimation to undergraduate/Masters students. However, the book can also be considered as a "theoretical background" handbook for use with the CAPTAIN Computer Package