Linear Models: The Theory and Application of Analysis of Variance

Brenton R. Clarke
Publisher: John Wiley
Publication Date: 2008
Number of Pages: 241
Format: Hardcover
Series: Wiley Series in Probability and Statistics
Price: 90.00
ISBN: 9780470025666
Category: Textbook

Reviewed by Ita Cirovic Donev, on 01/13/2009

Linear Models is yet another reference in the field, so what would be the proper role for it? There are many books out there that discuss linear models and ANOVA, some good, some not so good, and their authors invariably claim that theirs is different and, of course, better than the competition. This book does present a different approach. In the author's own words:

Where this book differs significantly from most books on ANOVA is in the [discussion] beginning with Helmert matrices and Kronecker products… which allows succinct and explicit forms of contrast that yield both the orthogonal components in ANOVA, including projection matrices, and distributions of component sums of squares with illustrations for a number of designs, including two-way ANOVA, Latin squares, and 2^k factorial designs.

So let’s see how well it works.
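
To give a flavour of the quoted approach, here is a minimal sketch of my own (in Python with NumPy; the helmert function and the toy data are illustrative and not taken from the book). A Helmert matrix of order n is orthogonal, its first row is proportional to the vector of ones, and its remaining rows are mutually orthogonal contrasts, so the squares of the transformed observations split the corrected sum of squares into orthogonal components:

    import numpy as np

    def helmert(n):
        """Return the n x n orthonormal Helmert matrix."""
        H = np.zeros((n, n))
        H[0, :] = 1.0 / np.sqrt(n)                 # overall-mean row
        for k in range(1, n):                      # orthogonal contrast rows
            H[k, :k] = 1.0 / np.sqrt(k * (k + 1))
            H[k, k] = -k / np.sqrt(k * (k + 1))
        return H

    y = np.array([3.0, 5.0, 4.0, 8.0])             # toy data
    z = helmert(len(y)) @ y

    # z[0]**2 is the "mean" sum of squares n * ybar**2; the remaining squared
    # components add up to the corrected total sum of squares.
    print(np.allclose(np.sum(z[1:] ** 2), np.sum((y - y.mean()) ** 2)))  # True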

The author takes some time to introduce the subject, providing a nice first chapter on fixed effects and presenting some common linear models. The next three chapters also, in my opinion, serve as introduction and preparation for the subject. Here we can refresh our minds with some vector space theory, in particular a discussion of orthogonal projections onto subspaces, which leads directly to least squares theory and the Gauss-Markov theorem. The author does not delve into long discussions but rather presents the important results. To round out the introduction, some distribution theory is presented.
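
For readers who want the punchline of these chapters up front, the picture is roughly the following (my notation, not necessarily the book's): the hat matrix P = X(X'X)^{-1}X' is the orthogonal projection onto the column space of the design matrix X, the fitted values are Py, and the residuals (I - P)y are orthogonal to every column of X, which is exactly what the normal equations say. A small numerical sketch with a made-up design matrix:

    import numpy as np

    rng = np.random.default_rng(0)
    X = np.column_stack([np.ones(6), rng.normal(size=6)])   # toy design matrix
    y = rng.normal(size=6)                                   # toy response

    P = X @ np.linalg.inv(X.T @ X) @ X.T    # orthogonal projection onto C(X)
    residuals = y - P @ y                   # residuals lie in the orthogonal complement

    print(np.allclose(P @ P, P))            # idempotent: P @ P = P
    print(np.allclose(X.T @ residuals, 0))  # normal equations: X'(y - Xb) = 0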

Given the prerequisite knowledge of matrix theory, mathematical statistics and probability theory, these four chapters should be easily grasped. Theorems are stated with full proofs; there is no hand-waving, and even some of the more routine steps are shown.

Chapter 5 presents orthogonal relationships, including theoretical results for Helmert matrices and the Kronecker product. The presentation is detailed and transparent enough to be followed easily. Some simpler examples are provided, along with one more detailed example that shows the calculation steps along the way. This easy-to-follow chapter provides a sound base for the remainder of the book, which deals with methods of estimation and fitting as well as with robust methods.
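
To give a rough idea of the kind of calculation the chapter formalizes (the details below are my own sketch, not the book's): for a two-way layout with one observation per cell, the Kronecker product of two Helmert matrices turns the vectorized data table into orthogonal components whose squares are the grand-mean, row, column and interaction sums of squares. Here SciPy's helmert builds the same matrix as in the earlier sketch:

    import numpy as np
    from scipy.linalg import helmert           # orthonormal Helmert matrix

    a, b = 3, 4                                # rows x columns, one observation per cell
    rng = np.random.default_rng(1)
    Y = rng.normal(size=(a, b))                # toy data table

    K = np.kron(helmert(a, full=True), helmert(b, full=True))
    Z = (K @ Y.reshape(-1)).reshape(a, b)      # equivalently Z = H_a Y H_b'

    ss_rows = np.sum(Z[1:, 0] ** 2)            # row sum of squares
    ss_cols = np.sum(Z[0, 1:] ** 2)            # column sum of squares
    ss_inter = np.sum(Z[1:, 1:] ** 2)          # interaction (residual) sum of squares

    # These agree with the usual two-way ANOVA decomposition:
    rm, cm, gm = Y.mean(axis=1), Y.mean(axis=0), Y.mean()
    print(np.allclose(ss_rows, b * np.sum((rm - gm) ** 2)))  # True
    print(np.allclose(ss_cols, a * np.sum((cm - gm) ** 2)))  # True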

The book is pitched at the graduate level, so it is written assuming a fairly broad background in statistics, probability and matrix theory. Proofs are detailed and do not skip too many steps. Exercises are provided and accompany the text quite nicely. Overall, it is hard for me to imagine this book as the main text for a course, but as a side reference it should be more than welcome.


Ita Cirovic Donev holds a master's degree in statistics from Rice University. Her main research interests are in mathematical finance, more precisely in statistical methods for credit and market risk. Apart from her academic work, she does statistical consulting for financial institutions in the area of risk management.

 

Preface.

Acknowledgments.

1. Introduction.

1.1 Introduction to the Linear Model and Examples.

1.2 What Are the Objectives?

1.3 Problems.

2. Projection Matrices and Vector Space Theory.

2.1 Basis of a Vector Space.

2.2 Range and Kernel.

2.3 Projections.

2.3.1 Linear Model Application.

2.4 Sums and Differences of Orthogonal Projections.

2.5 Problems.

3. Least Squares Theory.

3.1 The Normal Equations.

3.2 The Gauss-Markov Theorem.

3.3 The Distribution of S.

3.4 Some Simple Significance Tests.

3.5 Prediction Intervals.

3.6 Problems.

4. Distribution Theory.

4.1 Motivation.

4.2 Non-Central χ² and F Distributions.

4.2.1 Non-Central F-Distribution.

4.2.2 Applications to Linear Models.

4.2.3 Some Simple Extensions.

4.3 Problems.

5. Helmert Matrices and Orthogonal Relationships.

5.1 Transformations to Independent Normally Distributed Random Variables.

5.2 The Kronecker Product.

5.3 Orthogonal Components in Two-Way ANOVA: One Observation Per Cell.

5.4 Orthogonal Components in Two-Way ANOVA with Replications.

5.5 The Gauss-Markov Theorem Revisited.

5.6 Orthogonal Components for Interaction.

5.6.1 Testing for Interaction: One Observation Per Cell.

5.6.2 Example Calculation of Tukey’s 1 Degree of Freedom Statistic.

5.7 Problems.

6. Further Discussion of ANOVA.

6.1 The Different Representations of Orthogonal Components.

6.2 On the Lack of Orthogonality.

6.3 The Relationship Algebra.

6.4 The Triple Classification.

6.5 Latin Squares.

6.6 2^k Factorial Designs.

6.6.1 Yates’ Algorithm.

6.7 The Function of Randomization.

6.8 A Brief View of Multiple Comparison Techniques.

6.9 Problems.

7. Residual Analysis: Diagnostics and Robustness.

7.1 Design Diagnostics.

7.1.1 Standardized and Studentized Residuals.

7.1.2 Combining Design and Residual Effects on Fit - DFITS.

7.1.3 The Cook D-Statistic.

7.2 Robust Approaches.

7.2.1 Adaptive Trimmed Likelihood Algorithm.

7.3 Problems.

8. Models Including Variance Components.

8.1 The One-Way Random Effects Model.

8.2 The Mixed Two-Way Model.

8.3 A Split Plot Design.

8.3.1 A Traditional Model.

8.4 Problems.

9. Likelihood Approaches.

9.1 Maximum Likelihood Estimation.

9.2 REML.

9.3 Discussion of Hierarchical Statistical Models.

9.3.1 Hierarchy for the Mixed Model (Assuming Normality).

9.4 Problems.

10. Uncorrelated Residuals Formed from the Linear Model.

10.1 Best Linear Unbiased Error Estimates.

10.2 The Best Linear Unbiased Scalar-Covariance-Matrix Approach.

10.3 An Explicit Solution.

10.4 The Recursive Residuals.

10.4.1 The Recursive Residuals and their Properties.

10.5 Uncorrelated Residuals.

10.5.1 The Main Results.

10.5.2 Final Remarks.

10.6 Problems.

11. Further Inferential Questions Relating to ANOVA.

12. Permissions.

Glossary.

Bibliography.

References.

Topic Index.