Kalman Filtering Theory, 2nd rev. & enlarged ed., 1988, 265 pp., ISBN 0-911575-49-9, $72.00
A self-contained treatise on this important subject. Contains the
necessary review material on State Space Theory and Signal (Random Process)
Theory. Covers: Maximum Likelihood (and Unconditional Likelihood) Estimation,
the Modified Newton-Raphson Algorithm, and Cramer-Rao Bound calculation, illustrated
by application to Navigation and Tracking; derivation of the basic Kalman
Filter formulas; Riccati Equation and Error Propagation; Conditions for
Steady State Optimality and the associated Frequency Transfer Function theory;
Application to System Identification; Kalman Smoother; Correlated Signal
and Noise; Colored Observation Noise; Likelihood Ratios: Gaussian Signals
in Gaussian Noise; Extended Kalman Filter and application to nonlinear
tracking; Application to Stochastic Control. Problems are included for each
topic.
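The basic Kalman filter formulas mentioned above can be illustrated, in the simplest scalar case, by a short sketch. This example is not taken from the book; the model, notation, and parameter values (state transition a, process-noise variance q, observation-noise variance r) are illustrative assumptions only.

```python
# Minimal scalar Kalman filter sketch (illustrative; not the book's notation).
# Model: x_k = a*x_{k-1} + w_k (process noise variance q),
#        y_k = x_k + v_k      (observation noise variance r).

def kalman_step(x_est, p_est, y, a=1.0, q=0.01, r=0.1):
    """One predict/update cycle; returns the new estimate and its variance."""
    # Predict: propagate the estimate and its error variance (scalar Riccati step)
    x_pred = a * x_est
    p_pred = a * p_est * a + q
    # Update: Kalman gain weights the innovation (y - predicted observation)
    k = p_pred / (p_pred + r)
    x_new = x_pred + k * (y - x_pred)
    p_new = (1.0 - k) * p_pred
    return x_new, p_new

# Example: estimate a roughly constant signal near 1.0 from noisy observations.
x_est, p_est = 0.0, 1.0
for y in [1.1, 0.9, 1.05, 0.95]:
    x_est, p_est = kalman_step(x_est, p_est, y)
```

Each iteration shrinks the error variance p_est toward its steady-state value, the one-dimensional analogue of the steady-state theory covered in Chapter 4.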
TABLE OF CONTENTS
Preface to the Second Edition ..... ix
Preface to the First Edition ..... xi
Notation ..... xiii
CHAPTER 1. REVIEW OF LINEAR SYSTEM THEORY ..... 1
Time-Invariant Systems (3) -- Stability (4) -- Controllability
(5) -- Observability (7)
CHAPTER 2. REVIEW OF SIGNAL THEORY ..... 11
Spectral Theory of Signals with Finite Power (11) -- Stochastic
Signals: Second-Order Theory (14) -- Signal-Generation Models (17)
CHAPTER 3. STATISTICAL ESTIMATION THEORY ..... 27
3.1. Parameter Estimation: the Cramer-Rao Bound; the Principle of Maximum Likelihood ..... 27
Proof of the C-R Bound Formula (30) -- Principle of Maximum Likelihood (33) -- Application to Navigation (39)
3.2. Bayesian Theory of Estimation: Optimal Mean Square Estimates and Conditional Expectation ..... 42
Bayes Formula (45)
3.3. Gaussian Distributions: Conditional Density; Unconditional Maximum Likelihood; Mutual Information ..... 45
Calculating the Conditional Density (49) -- Unconditional Maximum Likelihood (51) -- Mutual Information (52)
3.4. Gram-Schmidt Orthogonalization, and Covariance Matrix Factorization ..... 53
3.5. Estimation of Signal Parameters in Additive Noise ..... 59
3.6. Performance Degradation Due to Parameter Uncertainty ..... 68
Example 3.6.1: A Tracking Problem: Linear Model (71) -- Example 3.6.1 Continued: Nonlinear Model (77) -- Calculating the MLE: Newton-Raphson Algorithm (81)
CHAPTER 4. THE KALMAN FILTER ..... 83
4.1. Basic Theory ..... 83
One-Step Predictor (94) -- p-Step Predictor (96) -- Signal Estimation Error Covariance (97) -- Fit Error (97) -- Innovation (100)
4.2. Kalman Filter: Steady State Theory ..... 106
One-Dimensional Example (123) -- Miscellaneous Remarks (126) -- Suboptimal Filtering (127)
4.3. Steady State Theory: Frequency Domain Analysis ..... 134
One-Dimensional Example (134) -- Two-Dimensional (Two-State) Example (137) -- General Case (140)
4.4. On-Line Estimation of System Parameters ..... 148
Problem Statement (148) -- On-Line Version (155) -- Application of Kalman Filtering (156) -- Necessary and Sufficient Conditions for Identifiability (162)
4.5. (Kalman) Smoother Filter ..... 166
4.6. Kalman Filter: Correlated Signal and Noise ..... 177
Example 4.6.2: Accelerometer Data (185) -- Example 4.6.2 Continued: A Suboptimal Filter (189)
4.7. Kalman Filter for Colored (Observation) Noise ..... 192
Time-Invariant Systems: Steady State Theory (195)
4.8. Example: Correcting for Vertical Deflection of Gravity ..... 197
Kalman Smoother (203)
CHAPTER 5. LIKELIHOOD RATIOS: GAUSSIAN SIGNALS
IN GAUSSIAN NOISE ..... 207
Likelihood Ratio Using Fit Error (210)
CHAPTER 6. THE EXTENDED KALMAN FILTER ..... 213
6.1. General Theory ..... 214
Existence of Minimum (215) -- Computational Algorithm (216)
6.2. The Zero State Noise Case ..... 221
6.3. The General Case ..... 226
CHAPTER 7. LINEAR QUADRATIC STOCHASTIC CONTROL ..... 229
7.1. Statement of the Control Problem ..... 229
Solving the "Separated" Problem: Dynamic Programming Approach (234)
7.2. Steady State Theory ..... 239
7.3. A More General Problem ..... 246
Bibliography ..... 249
Index ..... 251