Publisher: American Mathematical Society

Publication Date: 2007

Number of Pages: 252

Format: Paperback

Series: Student Mathematical Library 38

Price: 39.00

ISBN: 9780821843338

Category: Monograph

[Reviewed by Allen Stenger, on 11/20/2007]

This is an undergraduate text in statistical filtering. The authors' premise is that there is a need for a mathematically rigorous treatment at this level. I think they have succeeded in producing such a treatment, but the book is still unsatisfying for a couple of reasons:

- The structure of the book makes it hard to tell where you are or where you are going. For example, the subject of this book, filtering, first appears on p. 63, one-quarter of the way through the book (which seems a long time to wait). We then discover that the key theorem on how to filter was already proved on p. 29, though the authors did not call attention to it.
- There are no applications and few examples. The authors have essentially abstracted all the mathematically-interesting parts of the engineering subject of filtering and produced a pure-math version. I'm not convinced that there is a market for this kind of treatment, although the authors have been teaching such a course at the University of Minnesota for several years.

Here's an example of the type of practical problem that engineers tackle with filtering and prediction. (This is the kind of handy motivational information that you won't find in this book.) Suppose an enemy fighter airplane is flying toward us and we are tracking it on radar so that we can shoot a missile at it. Fighters are fairly small, the radar return is weak, and there is a lot of radio-frequency noise in the environment, so our measurements of the fighter's position and velocity are poor (corrupted by noise). We would probably model the fighter's progress with a simple model, e.g., constant velocity, but there will be small deviations due to wind gusts, engine variability, etc. The filtering problem is to remove the noise (both the deviations from the model and the noise in the measurements) to get an estimated position that is "best" in some sense. The prediction problem is to do this and then predict where the fighter will be a little while in the future, so we know where to aim the missile.
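The tracking scenario above can be sketched as a toy one-dimensional Kalman filter. (This sketch is my own illustration, not from the book; the function name, the constant-velocity model, and all noise parameters are assumptions chosen for the example.)

```python
import random


def kalman_1d(measurements, dt=1.0, meas_var=25.0, accel_var=0.01):
    """Illustrative 1-D constant-velocity Kalman filter.

    Estimates [position, velocity] from noisy position measurements,
    then predicts the position one time step ahead.
    """
    x = [measurements[0], 0.0]              # state estimate [pos, vel]
    P = [[meas_var, 0.0], [0.0, 100.0]]     # estimate covariance
    estimates = []
    for z in measurements:
        # Predict step: x <- F x with F = [[1, dt], [0, 1]] (constant velocity).
        x = [x[0] + dt * x[1], x[1]]
        p00 = P[0][0] + dt * (P[0][1] + P[1][0]) + dt * dt * P[1][1]
        p01 = P[0][1] + dt * P[1][1]
        p10 = P[1][0] + dt * P[1][1]
        p11 = P[1][1]
        # Process noise Q for small random accelerations (wind gusts, etc.).
        q = accel_var
        P = [[p00 + 0.25 * dt**4 * q, p01 + 0.5 * dt**3 * q],
             [p10 + 0.5 * dt**3 * q, p11 + dt * dt * q]]
        # Update step: the measurement z observes position only (H = [1, 0]).
        S = P[0][0] + meas_var              # innovation variance
        K = [P[0][0] / S, P[1][0] / S]      # Kalman gain
        y = z - x[0]                        # innovation (measurement residual)
        x = [x[0] + K[0] * y, x[1] + K[1] * y]
        P = [[(1 - K[0]) * P[0][0], (1 - K[0]) * P[0][1]],
             [P[1][0] - K[1] * P[0][0], P[1][1] - K[1] * P[0][1]]]
        estimates.append(x[0])
    predicted = x[0] + dt * x[1]            # one-step-ahead prediction
    return estimates, predicted
```

Fed simulated positions of a target moving at constant velocity plus Gaussian measurement noise, the filtered track settles close to the true trajectory, and the returned one-step prediction is the "where to aim" answer; this is the same filter/predict pair that the book treats rigorously in its Kalman chapters.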

The first three chapters of the book deal with discrete state spaces, mostly because these are mathematically simpler. These chapters are essentially self-contained and assume only a tiny amount of probability theory. The remaining five chapters move to continuous spaces and also handle the prediction problem. These are much more advanced, and the prerequisites probably include familiarity with Fourier transforms and infinite series.

Before adopting this book you should at least look at a couple of others to see what you may be missing, even though they are graduate-level texts and might be too advanced for your purposes. A classic book on the subject is *Optimal Filtering,* by Brian D. O. Anderson and John B. Moore (Prentice-Hall, 1979, reprinted by Dover). This is an engineering textbook but it is admired for its mathematical rigor. A good new text is *Optimal State Estimation: Kalman, H Infinity, and Nonlinear Approaches,* by Dan Simon (Wiley, 2006). Both are very clearly written, and both continually keep you informed of why you are studying this subject and where you are going.

Allen Stenger is a math hobbyist, library propagandist, and retired computer programmer. He volunteers in his spare time at MathNerds.com, a math help site that fosters inquiry learning. His mathematical interests are number theory and classical analysis.

Preface

Chapter 1. Preliminaries

1. Series

1.1. Sums and series

1.2. Some examples

1.3. The theorems of Fubini and Tonelli

2. Probability concepts

2.1. Random variables

2.2. Expectations of real-valued random variables that are not necessarily discrete

2.3. Second moments, variance, and standard deviation

2.4. Independence

2.5. Some important distributions

3. Conditioning in the discrete case

3.1. Generalities

3.2. Applications in estimating

Chapter 2. Markov chains

1. Random walks

2. Discrete time and space Markov chains

3. Further problems on Markov chains

Chapter 3. Filtering of discrete Markov chains

1. Filtering

2. Parameter estimation for discrete Markov chains

3. Interpolation for discrete Markov chains

3.1. Interpolation equation

3.2. Conjugate equations

3.3. Other problems related to interpolation

4. Prediction for discrete Markov chains

Chapter 4. Conditional expectations

1. L_2-spaces

1.1. Case of finite Ω

1.2. General case

2. Definition of conditional expectations

2.1. General case

2.2. The case of Gaussian random variables

3. Conditional expectations and densities

Chapter 5. Filtering of continuous-space Markov chains

1. Filtering of **R**^{d} -valued Markov chains

2. Discrete-time Kalman filter

2.1. One-dimensional case

2.2. Multidimensional case

3. Linear filtering

Chapter 6. Wiener process and continuous time filtering

1. Introduction

2. Definition and simplest properties of the Wiener process

3. Integration against the Wiener process

4. Recalling power series and systems of ODEs

5. Kalman filter in continuous time

Chapter 7. Stationary sequences

1. Definition and simplest properties

2. Spectral densities

3. Filtering stationary sequences

4. The Bochner-Khinchin theorem (optional)

5. The law of large numbers (optional)

6. Spectral representation of stationary sequences (optional)

Chapter 8. Prediction of stationary sequences

1. Some properties of rational spectral densities

2. Predicting one step ahead

3. Predicting many steps ahead

4. Dynamic predicting

Bibliography

Index
