Publisher: John Wiley

Publication Date: 2006

Number of Pages: 403

Format: Hardcover

Series: Wiley Series in Probability and Statistics

Price: 89.95

ISBN: 0470010924

The Basic Library List Committee suggests that undergraduate mathematics libraries consider this book for acquisition.

[Reviewed by Ita Cirovic Donev, on 02/14/2007]

Robust statistics has become a vital part of the field as more and more research is conducted on data that, in practice, do not follow a normal distribution. Examples of such non-normal data arise in finance, astronomy, biomedical studies, and elsewhere.

Reading *Robust Statistics* is a genuinely good way to learn the methods of robust statistics. The authors present intuition, definitions, and theory with ample commentary, detailed examples throughout the text, plenty of problems of both a theoretical and a computational nature, and references for further reading and research.

Robust methods are not just described. Rather, the authors lead us to see why the existing methods must be modified given the structure of the data at hand. They provide examples to illustrate why a robust method should be used and what results it yields.
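The contrast the authors draw between classical and robust estimates is easy to reproduce. The following sketch (in Python with NumPy, not taken from the book; the function name is mine, and the tuning constant k = 1.345 is a standard choice used here purely for illustration) computes a Huber M-estimate of location of the kind studied in Chapter 2, and shows that a few gross outliers drag the sample mean far from the bulk of the data while the M-estimate barely moves:

```python
import numpy as np

def huber_location(x, k=1.345, tol=1e-8, max_iter=100):
    """Huber M-estimate of location via iteratively reweighted means.

    Points within k scale-units of the current estimate get full
    weight; points farther out are downweighted, which bounds their
    influence on the result.
    """
    x = np.asarray(x, dtype=float)
    mu = np.median(x)
    # MAD rescaled to be consistent with sigma under normality
    scale = 1.4826 * np.median(np.abs(x - mu))
    for _ in range(max_iter):
        r = np.maximum(np.abs(x - mu) / scale, 1e-12)
        w = np.minimum(1.0, k / r)          # Huber weights
        mu_new = np.sum(w * x) / np.sum(w)
        if abs(mu_new - mu) < tol:
            break
        mu = mu_new
    return mu

rng = np.random.default_rng(0)
clean = rng.normal(loc=10.0, scale=1.0, size=100)
contaminated = np.concatenate([clean, [100.0, 120.0, 150.0]])

print(np.mean(contaminated))         # dragged well above 10 by 3 outliers
print(huber_location(contaminated))  # stays close to the bulk of the data
```

Three contaminating points out of 103 shift the mean by several standard deviations, while the M-estimate remains near 10: exactly the kind of before-and-after comparison the book uses to motivate each robust procedure.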

The text is written in a lucid style and is easy to follow. The examples are detailed and presented with appropriate data and figures. The book is also quite technical, in the sense that the theory is given in detail, but it is never forbiddingly so.

Proofs and additional comments are gathered in appendices at the ends of the chapters. I am usually not fond of that style, but the authors' evident intention is to make the book as accessible as possible to data analysts, applied statisticians, and scientists, and for that purpose it is a reasonable choice. For the more theoretically inclined, the book can serve as an appetizer: it provides the intuition and base knowledge needed for more advanced texts on robust methods.

This will be a great book for graduate students as well as for applied scientists and data analysts. It can serve very well for self-study, but it works even better as a course text, particularly since applied robust methods lend themselves to group discussion. A good knowledge of linear regression, calculus, and linear algebra is needed for most of the book, along with a working knowledge of multivariate analysis, time series analysis, and generalized linear models for selected chapters.

One can even download a time-limited version of S-PLUS for Windows, valid for 150 days; the web page is given in the book. This is excellent for readers who are serious about working through the book but do not have an S-PLUS license, as 150 days should be ample time to grasp the material.

Ita Cirovic Donev is a PhD candidate at the University of Zagreb. She holds a Master's degree in statistics from Rice University. Her main research interests are in mathematical finance, more precisely statistical methods of credit and market risk. Apart from her academic work, she does consulting work for financial institutions.

**Preface.**

**1. Introduction.**

1.1 Classical and robust approaches to statistics.

1.2 Mean and standard deviation.

1.3 The “three-sigma edit” rule.

1.4 Linear regression.

1.4.1 Straight-line regression.

1.4.2 Multiple linear regression.

1.5 Correlation coefficients.

1.6 Other parametric models.

1.7 Problems.

**2. Location and Scale.**

2.1 The location model.

2.2 M-estimates of location.

2.2.1 Generalizing maximum likelihood.

2.2.2 The distribution of M-estimates.

2.2.3 An intuitive view of M-estimates.

2.2.4 Redescending M-estimates.

2.3 Trimmed means.

2.4 Dispersion estimates.

2.5 M-estimates of scale.

2.6 M-estimates of location with unknown dispersion.

2.6.1 Previous estimation of dispersion.

2.6.2 Simultaneous M-estimates of location and dispersion.

2.7 Numerical computation of M-estimates.

2.7.1 Location with previously computed dispersion estimation.

2.7.2 Scale estimates.

2.7.3 Simultaneous estimation of location and dispersion.

2.8 Robust confidence intervals and tests.

2.8.1 Confidence intervals.

2.8.2 Tests.

2.9 Appendix: proofs and complements.

2.9.1 Mixtures.

2.9.2 Asymptotic normality of M-estimates.

2.9.3 Slutsky’s lemma.

2.9.4 Quantiles.

2.9.5 Alternative algorithms for M-estimates.

2.10 Problems.

**3. Measuring Robustness.**

3.1 The influence function.

3.1.1 *The convergence of the SC to the IF.

3.2 The breakdown point.

3.2.1 Location M-estimates.

3.2.2 Scale and dispersion estimates.

3.2.3 Location with previously computed dispersion estimate.

3.2.4 Simultaneous estimation.

3.2.5 Finite-sample breakdown point.

3.3 Maximum asymptotic bias.

3.4 Balancing robustness and efficiency.

3.5 *“Optimal” robustness.

3.5.1 Bias and variance optimality of location estimates.

3.5.2 Bias optimality of scale and dispersion estimates.

3.5.3 The infinitesimal approach.

3.5.4 The Hampel approach.

3.5.5 Balancing bias and variance: the general problem.

3.6 Multidimensional parameters.

3.7 *Estimates as functionals.

3.8 Appendix: proofs of results.

3.8.1 IF of general M-estimates.

3.8.2 Maximum BP of location estimates.

3.8.3 BP of location M-estimates.

3.8.4 Maximum bias of location M-estimates.

3.8.5 The minimax bias property of the median.

3.8.6 Minimizing the GES.

3.8.7 Hampel optimality.

3.9 Problems.

**4. Linear Regression 1.**

4.1 Introduction.

4.2 Review of the LS method.

4.3 Classical methods for outlier detection.

4.4 Regression M-estimates.

4.4.1 M-estimates with known scale.

4.4.2 M-estimates with preliminary scale.

4.4.3 Simultaneous estimation of regression and scale.

4.5 Numerical computation of monotone M-estimates.

4.5.1 The L1 estimate.

4.5.2 M-estimates with smooth *ψ*-function.

4.6 Breakdown point of monotone regression estimates.

4.7 Robust tests for linear hypothesis.

4.7.1 Review of the classical theory.

4.7.2 Robust tests using M-estimates.

4.8 *Regression quantiles.

4.9 Appendix: proofs and complements.

4.9.1 Why equivariance?

4.9.2 Consistency of estimated slopes under asymmetric errors.

4.9.3 Maximum FBP of equivariant estimates.

4.9.4 The FBP of monotone M-estimates.

4.10 Problems.

**5. Linear Regression 2.**

5.1 Introduction.

5.2 The linear model with random predictors.

5.3 M-estimates with a bounded *ρ*-function.

5.4 Properties of M-estimates with a bounded *ρ*-function.

5.4.1 Breakdown point.

5.4.2 Influence function.

5.4.3 Asymptotic normality.

5.5 MM-estimates.

5.6 Estimates based on a robust residual scale.

5.6.1 S-estimates.

5.6.2 L-estimates of scale and the LTS estimate.

5.6.3 Improving efficiency with one-step reweighting.

5.6.4 A fully efficient one-step procedure.

5.7 Numerical computation of estimates based on robust scales.

5.7.1 Finding local minima.

5.7.2 The subsampling algorithm.

5.7.3 A strategy for fast iterative estimates.

5.8 Robust confidence intervals and tests for M-estimates.

5.8.1 Bootstrap robust confidence intervals and tests.

5.9 Balancing robustness and efficiency.

5.9.1 “Optimal” redescending M-estimates.

5.10 The exact fit property.

5.11 Generalized M-estimates.

5.12 Selection of variables.

5.13 Heteroskedastic errors.

5.13.1 Improving the efficiency of M-estimates.

5.13.2 Estimating the asymptotic covariance matrix under heteroskedastic errors.

5.14 *Other estimates.

5.14.1 *τ*-estimates.

5.14.2 Projection estimates.

5.14.3 Constrained M-estimates.

5.14.4 Maximum depth estimates.

5.15 Models with numeric and categorical predictors.

5.16 *Appendix: proofs and complements.

5.16.1 The BP of monotone M-estimates with random *X.*

5.16.2 Heavy-tailed x.

5.16.3 Proof of the exact fit property.

5.16.4 The BP of S-estimates.

5.16.5 Asymptotic bias of M-estimates.

5.16.6 Hampel optimality for GM-estimates.

5.16.7 Justification of RFPE*.

5.16.8 A robust multiple correlation coefficient.

5.17 Problems.

**6. Multivariate Analysis.**

6.1 Introduction.

6.2 Breakdown and efficiency of multivariate estimates.

6.2.1 Breakdown point.

6.2.2 The multivariate exact fit property.

6.2.3 Efficiency.

6.3 M-estimates.

6.3.1 Collinearity.

6.3.2 Size and shape.

6.3.3 Breakdown point.

6.4 Estimates based on a robust scale.

6.4.1 The minimum volume ellipsoid estimate.

6.4.2 S-estimates.

6.4.3 The minimum covariance determinant estimate.

6.4.4 S-estimates for high dimension.

6.4.5 One-step reweighting.

6.5 The Stahel–Donoho estimate.

6.6 Asymptotic bias.

6.7 Numerical computation of multivariate estimates.

6.7.1 Monotone M-estimates.

6.7.2 Local solutions for S-estimates.

6.7.3 Subsampling for estimates based on a robust scale.

6.7.4 The MVE.

6.7.5 Computation of S-estimates.

6.7.6 The MCD.

6.7.7 The Stahel–Donoho estimate.

6.8 Comparing estimates.

6.9 Faster robust dispersion matrix estimates.

6.9.1 Using pairwise robust covariances.

6.9.2 Using kurtosis.

6.10 Robust principal components.

6.10.1 Robust PCA based on a robust scale.

6.10.2 Spherical principal components.

6.11 *Other estimates of location and dispersion.

6.11.1 Projection estimates.

6.11.2 Constrained M-estimates.

6.11.3 Multivariate MM- and *τ*-estimates.

6.11.4 Multivariate depth.

6.12 Appendix: proofs and complements.

6.12.1 Why affine equivariance?

6.12.2 Consistency of equivariant estimates.

6.12.3 The estimating equations of the MLE.

6.12.4 Asymptotic BP of monotone M-estimates.

6.12.5 The estimating equations for S-estimates.

6.12.6 Behavior of S-estimates for high *p.*

6.12.7 Calculating the asymptotic covariance matrix of location M-estimates.

6.12.8 The exact fit property.

6.12.9 Elliptical distributions.

6.12.10 Consistency of Gnanadesikan–Kettenring correlations.

6.12.11 Spherical principal components.

6.13 Problems.

**7. Generalized Linear Models.**

7.1 Logistic regression.

7.2 Robust estimates for the logistic model.

7.2.1 Weighted MLEs.

7.2.2 Redescending M-estimates.

7.3 Generalized linear models.

7.3.1 Conditionally unbiased bounded influence estimates.

7.3.2 Other estimates for GLMs.

7.4 Problems.

**8. Time Series.**

8.1 Time series outliers and their impact.

8.1.1 Simple examples of outliers’ influence.

8.1.2 Probability models for time series outliers.

8.1.3 Bias impact of AOs.

8.2 Classical estimates for AR models.

8.2.1 The Durbin–Levinson algorithm.

8.2.2 Asymptotic distribution of classical estimates.

8.3 Classical estimates for ARMA models.

8.4 M-estimates of ARMA models.

8.4.1 M-estimates and their asymptotic distribution.

8.4.2 The behavior of M-estimates in AR processes with AOs.

8.4.3 The behavior of LS and M-estimates for ARMA processes with infinite innovations variance.

8.5 Generalized M-estimates.

8.6 Robust AR estimation using robust filters.

8.6.1 Naive minimum robust scale AR estimates.

8.6.2 The robust filter algorithm.

8.6.3 Minimum robust scale estimates based on robust filtering.

8.6.4 A robust Durbin–Levinson algorithm.

8.6.5 Choice of scale for the robust Durbin–Levinson procedure.

8.6.6 Robust identification of AR order.

8.7 Robust model identification.

8.7.1 Robust autocorrelation estimates.

8.7.2 Robust partial autocorrelation estimates.

8.8 Robust ARMA model estimation using robust filters.

8.8.1 *τ*-estimates of ARMA models.

8.8.2 Robust filters for ARMA models.

8.8.3 Robustly filtered *τ*-estimates.

8.9 ARIMA and SARIMA models.

8.10 Detecting time series outliers and level shifts.

8.10.1 Classical detection of time series outliers and level shifts.

8.10.2 Robust detection of outliers and level shifts for ARIMA models.

8.10.3 REGARIMA models: estimation and outlier detection.

8.11 Robustness measures for time series.

8.11.1 Influence function.

8.11.2 Maximum bias.

8.11.3 Breakdown point.

8.11.4 Maximum bias curves for the AR(1) model.

8.12 Other approaches for ARMA models.

8.12.1 Estimates based on robust autocovariances.

8.12.2 Estimates based on memory-*m* prediction residuals.

8.13 High-efficiency robust location estimates.

8.14 Robust spectral density estimation.

8.14.1 Definition of the spectral density.

8.14.2 AR spectral density.

8.14.3 Classic spectral density estimation methods.

8.14.4 Prewhitening.

8.14.5 Influence of outliers on spectral density estimates.

8.14.6 Robust spectral density estimation.

8.14.7 Robust time-average spectral density estimate.

8.15 Appendix A: heuristic derivation of the asymptotic distribution of M-estimates for ARMA models.

8.16 Appendix B: robust filter covariance recursions.

8.17 Appendix C: ARMA model state-space representation.

8.18 Problems.

**9. Numerical Algorithms.**

9.1 Regression M-estimates.

9.2 Regression S-estimates.

9.3 The LTS-estimate.

9.4 Scale M-estimates.

9.4.1 Convergence of the fixed point algorithm.

9.4.2 Algorithms for the nonconcave case.

9.5 Multivariate M-estimates.

9.6 Multivariate S-estimates.

9.6.1 S-estimates with monotone weights.

9.6.2 The MCD.

9.6.3 S-estimates with nonmonotone weights.

9.6.4 *Proof of (9.25).

**10. Asymptotic Theory of M-estimates.**

10.1 Existence and uniqueness of solutions.

10.2 Consistency.

10.3 Asymptotic normality.

10.4 Convergence of the SC to the IF.

10.5 M-estimates of several parameters.

10.6 Location M-estimates with preliminary scale.

10.7 Trimmed means.

10.8 Optimality of the MLE.

10.9 Regression M-estimates.

10.9.1 Existence and uniqueness.

10.9.2 Asymptotic normality: fixed X.

10.9.3 Asymptotic normality: random X.

10.10 Nonexistence of moments of the sample median.

10.11 Problems.

**11. Robust Methods in S-Plus.**

11.1 Location M-estimates: function *Mestimate.*

11.2 Robust regression.

11.2.1 A general function for robust regression: *lmRob.*

11.2.2 Categorical variables: functions *as.factor* and *contrasts.*

11.2.3 Testing linear assumptions: function *rob.linear.test.*

11.2.4 Stepwise variable selection: function *step.*

11.3 Robust dispersion matrices.

11.3.1 A general function for computing robust location–dispersion estimates: *covRob.*

11.3.2 The SR-*α* estimate: function *cov.SRocke.*

11.3.3 The bisquare S-estimate: function *cov.Sbic.*

11.4 Principal components.

11.4.1 Spherical principal components: function *prin.comp.rob.*

11.4.2 Principal components based on a robust dispersion matrix: function *princomp.cov.*

11.5 Generalized linear models.

11.5.1 M-estimate for logistic models: function *BYlogreg.*

11.5.2 Weighted M-estimate: function *WBYlogreg.*

11.5.3 A general function for generalized linear models: *glmRob.*

11.6 Time series.

11.6.1 GM-estimates for AR models: function *ar.gm.*

11.6.2 F*τ*-estimates and outlier detection for ARIMA and REGARIMA models: function *arima.rob.*

11.7 Public-domain software for robust methods.

**12. Description of Data Sets.**

**Bibliography.**

**Index.**
