- Publisher: Chapman & Hall/CRC
- Publication Date: 2010
- Number of Pages: 289
- Format: Hardcover
- Series: Monographs on Statistics and Applied Probability 114
- Price: $79.95
- ISBN: 9781584889212
- Category: Textbook

[Reviewed by William J. Satzer, on 10/21/2010]

A time series is a sequence of data points. Typically the data points are arranged according to the time at which they are collected, though certain other kinds of sequential data can be amenable to time series analysis. Usually the data are collected at uniform time intervals. The characteristic property of time series is that the data are not generated independently, and the nature of the dependence is of great interest. Often the data embody trends and have cyclic components. These too are of significant interest. Time series arise in essentially any discipline in which sequences of numerical data are collected. Applications range from medical (electrocardiograms and electroencephalograms) to business and economics (stock prices and interest rates) to meteorology (daily maximum temperatures and precipitation).

Time series analysis consists of methods for exploring sequential data in order to extract meaningful statistics and identify significant characteristics. The goals of the analysis generally include at least one of the following: development of models to understand the stochastic processes underlying the data, prediction of future events based on past events, application of control to influence future values of the data, and identification of persistent signals present in the data.

The current book focuses on the modeling of time series. In roughly the first half of the text, the author introduces the basic ideas and primary tools of time series analysis. A significant part of this is the treatment of autoregressive-type models, such as the ARMA (autoregressive moving average) model, which represents a time series value as a linear combination of past values and white noise terms. The challenge with these is first to estimate the order of the model (how many past values to use), and then to estimate the unknown coefficients. The author describes several approaches, including the standard one that uses the Yule-Walker method and Levinson’s algorithm.
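The Yule-Walker/Levinson approach mentioned above can be sketched in a few lines. The following is a minimal illustration assuming NumPy; the `levinson_durbin` helper and the simulated AR(2) series are hypothetical examples, not taken from the book:

```python
import numpy as np

def levinson_durbin(r, order):
    """Solve the Yule-Walker equations for AR coefficients using
    Levinson's recursion.  r[0..order] are sample autocovariances."""
    a = np.zeros(order + 1)      # a[1..m] hold the AR coefficients
    e = r[0]                     # one-step prediction error variance
    for m in range(1, order + 1):
        # reflection (PARCOR) coefficient at lag m
        k = (r[m] - np.dot(a[1:m], r[m-1:0:-1])) / e
        a_new = a.copy()
        a_new[m] = k
        a_new[1:m] = a[1:m] - k * a[m-1:0:-1]
        a = a_new
        e *= (1.0 - k * k)
    return a[1:], e

# Fit an AR(2) model to data simulated from
#   x_t = 0.6 x_{t-1} - 0.3 x_{t-2} + w_t,  w_t ~ N(0, 1)
rng = np.random.default_rng(0)
n = 5000
x = np.zeros(n)
for t in range(2, n):
    x[t] = 0.6 * x[t-1] - 0.3 * x[t-2] + rng.standard_normal()
x -= x.mean()
# sample autocovariances at lags 0, 1, 2
r = np.array([np.dot(x[:n-k], x[k:]) / n for k in range(3)])
coef, sigma2 = levinson_durbin(r, 2)
print(coef)   # estimates close to (0.6, -0.3)
```

Order selection (how far back to look) would typically be handled by computing this fit for a range of orders and comparing an information criterion such as the AIC, which the book also covers.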

Although both time-domain and frequency-domain approaches to time series modeling are commonly used, the author concentrates on the time domain. Just one chapter discusses spectral analysis and periodograms. What distinguishes this book from comparable introductory texts is the use of state space modeling. Along with this come a number of valuable tools for recursive filtering and smoothing, including the Kalman filter as well as non-Gaussian and sequential Monte Carlo filters.
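To give a flavor of the recursive filtering machinery, the Kalman filter for a linear Gaussian state-space model can be written compactly. This is a minimal sketch assuming NumPy; the `kalman_filter` helper and the random-walk-plus-noise example are hypothetical illustrations, not the book's own code:

```python
import numpy as np

def kalman_filter(y, F, G, H, Q, R, x0, V0):
    """Filter the linear Gaussian state-space model
        x_t = F x_{t-1} + G v_t,   v_t ~ N(0, Q)   (state equation)
        y_t = H x_t + w_t,         w_t ~ N(0, R)   (observation equation)
    returning the filtered state means and covariances."""
    x, V = x0, V0
    means, covs = [], []
    for yt in y:
        # prediction step
        x = F @ x
        V = F @ V @ F.T + G @ Q @ G.T
        # update step
        S = H @ V @ H.T + R              # innovation covariance
        K = V @ H.T @ np.linalg.inv(S)   # Kalman gain
        x = x + K @ (yt - H @ x)
        V = V - K @ H @ V
        means.append(x)
        covs.append(V)
    return np.array(means), np.array(covs)

# Track a random-walk level observed in noise.
rng = np.random.default_rng(1)
n = 200
level = np.cumsum(0.1 * rng.standard_normal(n))   # hidden state
y = level + 0.5 * rng.standard_normal(n)          # noisy observations
F = G = H = np.eye(1)
Q, R = np.array([[0.01]]), np.array([[0.25]])
means, covs = kalman_filter(y.reshape(-1, 1), F, G, H, Q, R,
                            np.zeros(1), np.eye(1))
```

The filtered means track the hidden level much more closely than the raw observations do; smoothing algorithms, likelihood evaluation, and the non-Gaussian extensions discussed in the book all build on this same recursion.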

As an introduction to the subject, this book has an excessively theoretical emphasis. Although the author provides many graphs of interesting time series from a variety of sources, there are very few examples of computation using those data. Most students learn time series analysis best by applying the techniques they’re learning to real data sets, but they have very little opportunity to see that done here. By contrast, *Time Series Analysis: Forecasting and Control* by Box and Jenkins, a classic text in this field, uses a collection of standard time series data to great effect in order to illustrate individual analysis methods. Another important practical issue is that common errors in modeling and analyzing time series are not considered. Yet that is something that students need to see.

The current book is a useful reference for the application of state space modeling to time series. As an introductory textbook, however, it leaves much to be desired.

Bill Satzer (wjsatzer@mmm.com) is a senior intellectual property scientist at 3M Company, having previously been a lab manager at 3M for composites and electromagnetic materials. His training is in dynamical systems and particularly celestial mechanics; his current interests are broadly in applied mathematics and the teaching of mathematics.

**Introduction and Preparatory Analysis**

Time Series Data

Classification of Time Series

Objectives of Time Series Analysis

Preprocessing of Time Series

Organization of This Book

**The Covariance Function**

The Distribution of Time Series and Stationarity

The Autocovariance Function of Stationary Time Series

Estimation of the Autocovariance Function

Multivariate Time Series and Scatterplots

Cross-Covariance Function and Cross-Correlation Function

**The Power Spectrum and the Periodogram**

The Power Spectrum

The Periodogram

Averaging and Smoothing of the Periodogram

Computational Method of Periodogram

Computation of the Periodogram by Fast Fourier Transform

**Statistical Modeling**

Probability Distributions and Statistical Models

K-L Information and the Entropy Maximization Principle

Estimation of the K-L Information and Log-Likelihood

Estimation of Parameters by the Maximum Likelihood Method

Akaike Information Criterion (AIC)

Transformation of Data

**The Least Squares Method**

Regression Models and the Least Squares Method

Householder Transformation Method

Selection of Order by AIC

Addition of Data and Successive Householder Reduction

Variable Selection by AIC

**Analysis of Time Series Using ARMA Models**

ARMA Model

The Impulse Response Function

The Autocovariance Function

The Relation between AR Coefficients and the PARCOR

The Power Spectrum of the ARMA Process

The Characteristic Equation

The Multivariate AR Model

**Estimation of an AR Model**

Fitting an AR Model

Yule–Walker Method and Levinson’s Algorithm

Estimation of an AR Model by the Least Squares Method

Estimation of an AR Model by the PARCOR Method

Large Sample Distribution of the Estimates

Yule–Walker Method for MAR Model

Least Squares Method for MAR Model

**The Locally Stationary AR Model**

Locally Stationary AR Model

Automatic Partitioning of the Time Interval

Precise Estimation of a Change Point

**Analysis of Time Series with a State-Space Model**

The State-Space Model

State Estimation via the Kalman Filter

Smoothing Algorithms

Increasing Horizon Prediction of the State

Prediction of Time Series

Likelihood Computation and Parameter Estimation for a Time Series Model

Interpolation of Missing Observations

**Estimation of the ARMA Model**

State-Space Representation of the ARMA Model

Initial State of an ARMA Model

Maximum Likelihood Estimate of an ARMA Model

Initial Estimates of Parameters

**Estimation of Trends**

The Polynomial Trend Model

Trend Component Model—Model for Probabilistic Structural Changes

Trend Model

**The Seasonal Adjustment Model**

Seasonal Component Model

Standard Seasonal Adjustment Model

Decomposition Including an AR Component

Decomposition Including a Trading-Day Effect

**Time-Varying Coefficient AR Model**

Time-Varying Variance Model

Time-Varying Coefficient AR Model

Estimation of the Time-Varying Spectrum

The Assumption on System Noise for the Time-Varying Coefficient AR Model

Abrupt Changes of Coefficients

**Non-Gaussian State-Space Model**

Necessity of Non-Gaussian Models

Non-Gaussian State-Space Models and State Estimation

Numerical Computation of the State Estimation Formula

Non-Gaussian Trend Model

A Time-Varying Variance Model

Applications of Non-Gaussian State-Space Model

**The Sequential Monte Carlo Filter**

The Nonlinear Non-Gaussian State-Space Model and Approximations of Distributions

Monte Carlo Filter

Monte Carlo Smoothing Method

Nonlinear Smoothing

**Simulation**

Generation of Uniform Random Numbers

Generation of Gaussian White Noise

Simulation Using a State-Space Model

Simulation with Non-Gaussian Model

**Appendix A: Algorithms for Nonlinear Optimization
Appendix B: Derivation of Levinson’s Algorithm
Appendix C: Derivation of the Kalman Filter and Smoother Algorithms
Appendix D: Algorithm for the Monte Carlo Filter**

**Bibliography**
