A Matrix Handbook for Statisticians

George A. F. Seber
Publisher: John Wiley
Publication Date: 2007
Number of Pages: 559
Format: Hardcover
Series: Wiley Series in Probability and Statistics
Price: 110.00
ISBN: 978-0-471-74869-4
Category: Monograph
[Reviewed by Brian Borchers, on 03/8/2008]

Matrix analysis is a subject with applications in many other mathematical fields, including applied probability, statistics, optimization, and control theory. This book is a reference on matrix analysis with particular emphasis on the parts of the subject that are most relevant to applications in statistics.

Topics covered include matrix and vector norms, eigenvalues and eigenvectors, the singular value decomposition, generalized inverses, Jacobians of linear transformations, differentiation of functions of matrices, matrix inequalities, nonnegative matrices, positive definite matrices, and circulant, Toeplitz, and Hankel matrices. Most of this material is of general interest in many areas in which matrix analysis is used. Additional chapters on random variables, random matrices, and bounds on random variables will be of more interest to statisticians.
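
To give a flavor of how these topics interlock, the Moore-Penrose inverse (the g1234 inverse of Chapter 7) can be built directly from the singular value decomposition and then checked against the four Penrose conditions that define it. The short NumPy sketch below is my own illustration on a made-up rank-deficient matrix, not an example taken from the book.

import numpy as np

# A made-up rank-deficient matrix (the third column is the sum of the first two).
A = np.array([[1.0, 2.0,  3.0],
              [4.0, 5.0,  9.0],
              [7.0, 8.0, 15.0],
              [1.0, 0.0,  1.0]])

# Moore-Penrose inverse from the SVD A = U diag(s) V^T:
# A^+ = V diag(s^+) U^T, where s^+ inverts only the nonzero singular values.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
tol = max(A.shape) * np.finfo(float).eps * s.max()
s_plus = np.zeros_like(s)
s_plus[s > tol] = 1.0 / s[s > tol]
A_plus = Vt.T @ np.diag(s_plus) @ U.T

# The four Penrose conditions that characterize the Moore-Penrose inverse.
print(np.allclose(A @ A_plus @ A, A))            # A X A = A
print(np.allclose(A_plus @ A @ A_plus, A_plus))  # X A X = X
print(np.allclose((A @ A_plus).T, A @ A_plus))   # A X is symmetric
print(np.allclose((A_plus @ A).T, A_plus @ A))   # X A is symmetric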

It must be understood that this is a reference work rather than a textbook or monograph. The book consists almost entirely of definitions and statements of theorems. Proofs are not given, but the author provides extensive references to books and papers that include the proofs of each theorem. Readers will need to have previous exposure to matrix analysis at the level of textbooks such as Horn and Johnson (1990) and Searle (2006).

In evaluating a reference work such as this one, the bibliography and index are particularly important. Seber has been very careful to cite sources for all of the results in the book. The reference list is very thorough and up to date. I found the index to be somewhat inadequate, with some obvious terms missing. For example, although Gersgorin's theorem is easily found in the chapter on eigenvalues and eigenvectors, Gersgorin's name does not appear in the index.
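
For readers who have not met it, Gersgorin's theorem says that every eigenvalue of a square matrix lies in at least one disk in the complex plane centered at a diagonal entry, with radius equal to the sum of the absolute values of the remaining entries in that row. The NumPy sketch below, my own illustration on a made-up 3-by-3 matrix rather than anything taken from the handbook, simply checks this numerically.

import numpy as np

# A made-up 3-by-3 matrix; any square matrix will do.
A = np.array([[4.0, 1.0, 0.5],
              [0.2, 3.0, 0.3],
              [0.1, 0.4, 1.0]])

# Each Gersgorin disk is centered at a diagonal entry, with radius equal to
# the sum of the absolute values of the off-diagonal entries in that row.
centers = np.diag(A)
radii = np.abs(A).sum(axis=1) - np.abs(centers)

# Every eigenvalue of A must lie in at least one of these disks.
for lam in np.linalg.eigvals(A):
    in_some_disk = bool(np.any(np.abs(lam - centers) <= radii))
    print("eigenvalue", lam, "lies in a Gersgorin disk:", in_some_disk)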

Seber's book is similar in aims and style to Marcus and Minc (1964). In comparison with the book by Marcus and Minc, Seber's book is substantially broader in its coverage of topics. It is also much more up to date. Thus Seber's book should supersede Marcus and Minc. This is an authoritative and comprehensive reference that will be useful to researchers who need to use the results of matrix analysis in their work. It would also be a useful addition to the reference collection of any mathematical library.


References:

R. A. Horn and C. R. Johnson, Matrix Analysis, Cambridge University Press, 1990.

M. Marcus and H. Minc, A Survey of Matrix Theory and Matrix Inequalities, Prindle, Weber, and Schmidt, 1964.

S. R. Searle, Matrix Algebra Useful for Statistics, Wiley-Interscience, 2006.


Brian Borchers is a professor of Mathematics at the New Mexico Institute of Mining and Technology. His interests are in optimization and applications of optimization in parameter estimation and inverse problems. 

Preface.

1. Notation.

1.1 General Definitions.

1.2 Some Continuous Univariate Distributions.

1.3 Glossary of Notation.

2. Vectors, Vector Spaces, and Convexity.

2.1 Vector Spaces.

2.1.1 Definitions.

2.1.2 Quadratic Subspaces.

2.1.3 Sums and Intersections of Subspaces.

2.1.4 Span and Basis.

2.1.5 Isomorphism.

2.2 Inner Products.

2.2.1 Definition and Properties.

2.2.2 Functionals.

2.2.3 Orthogonality.

2.2.4 Column and Null Spaces.

2.3 Projections.

2.3.1 General Projections.

2.3.2 Orthogonal Projections.

2.4 Metric Spaces.

2.5 Convex Sets and Functions.

2.6 Coordinate Geometry.

2.6.1 Hyperplanes and Lines.

2.6.2 Quadratics.

2.6.3 Miscellaneous Results.

3. Rank.

3.1 Some General Properties.

3.2 Matrix Products.

3.3 Matrix Cancellation Rules.

3.4 Matrix Sums.

3.5 Matrix Differences.

3.6 Partitioned Matrices.

3.7 Maximal and Minimal Ranks.

3.8 Matrix Index.

4. Matrix Functions: Inverse, Transpose, Trace, Determinant, and Norm.

4.1 Inverse.

4.2 Transpose.

4.3 Trace.

4.4 Determinants.

4.4.1 Introduction.

4.4.2 Adjoint Matrix.

4.4.3 Compound Matrix.

4.4.4 Expansion of a Determinant.

4.5 Permanents.

4.6 Norms.

4.6.1 Vector Norms.

4.6.2 Matrix Norms.

4.6.3 Unitarily Invariant Norms.

4.6.4 M,N-Invariant Norms.

4.6.5 Computational Accuracy.

5. Complex, Hermitian, and Related Matrices.

5.1 Complex Matrices.

5.1.1 Some General Results.

5.1.2 Determinants.

5.2 Hermitian Matrices.

5.3 Skew-Hermitian Matrices.

5.4 Complex Symmetric Matrices.

5.5 Real Skew-Symmetric Matrices.

5.6 Normal Matrices.

5.7 Quaternions.

6. Eigenvalues, Eigenvectors, and Singular Values.

6.1 Introduction and Definitions.

6.1.1 Characteristic Polynomial.

6.1.2 Eigenvalues.

6.1.3 Singular Values.

6.1.4 Functions of a Matrix.

6.1.5 Eigenvectors.

6.1.6 Hermitian Matrices.

6.1.7 Computational Methods.

6.1.8 Generalized Eigenvalues.

6.1.9 Matrix Products.

6.2 Variational Characteristics for Hermitian Matrices.

6.3 Separation Theorems.

6.4 Inequalities for Matrix Sums.

6.5 Inequalities for Matrix Differences.

6.6 Inequalities for Matrix Products.

6.7 Antieigenvalues and Antieigenvectors.

7. Generalized Inverses.

7.1 Definitions.

7.2 Weak Inverses.

7.2.1 General Properties.

7.2.2 Products.

7.2.3 Sums and Differences.

7.2.4 Real Symmetric Matrices.

7.2.5 Decomposition Methods.

7.3 Other Inverses.

7.3.1 Reflexive (g12) Inverse.

7.3.2 Minimum Norm (g14) Inverse.

7.3.3 Minimum Norm Reflexive (g124) Inverse.

7.3.4 Least Squares (g13) Inverse.

7.3.5 Least Squares Reflexive (g123) Inverse.

7.4 Moore-Penrose (g1234) Inverse.

7.4.1 General Properties.

7.4.2 Sums.

7.4.3 Products.

7.5 Group Inverse.

7.6 Some General Properties of Inverses.

8. Some Special Matrices.

8.1 Orthogonal and Unitary Matrices.

8.2 Permutation Matrices.

8.3 Circulant, Toeplitz, and Related Matrices.

8.3.1 Regular Circulant.

8.3.2 Symmetric Regular Circulant.

8.3.3 Symmetric Circulant.

8.3.4 Toeplitz Matrix.

8.3.5 Persymmetric Matrix.

8.3.6 Cross-Symmetric (Centrosymmetric) Matrix.

8.3.7 Block Circulant.

8.3.8 Hankel Matrix.

8.4 Diagonally Dominant Matrices.

8.5 Hadamard Matrices.

8.6 Idempotent Matrices.

8.6.1 General Properties.

8.6.2 Sums of Idempotent Matrices and Extensions.

8.6.3 Products of Idempotent Matrices.

8.7 Tripotent Matrices.

8.8 Irreducible Matrices.

8.9 Triangular Matrices.

8.10 Hessenberg Matrices.

8.11 Tridiagonal Matrices.

8.12 Vandermonde and Fourier Matrices.

8.12.1 Vandermonde Matrix.

8.12.2 Fourier Matrix.

8.13 Zero-One (0,1) Matrices.

8.14 Some Miscellaneous Matrices and Arrays.

8.14.1 Krylov Matrix.

8.14.2 Nilpotent and Unipotent Matrices.

8.14.3 Payoff Matrix.

8.14.4 Stable and Positive Stable Matrices.

8.14.5 P-Matrix.

8.14.6 Z- and M-Matrices.

8.14.7 Three-Dimensional Arrays.

9. Non-Negative Vectors and Matrices.

9.1 Introduction.

9.1.1 Scaling.

9.1.2 Modulus of a Matrix.

9.2 Spectral Radius.

9.2.1 General Properties.

9.2.2 Dominant Eigenvalue.

9.3 Canonical Form of a Non-negative Matrix.

9.4 Irreducible Matrices.

9.4.1 Irreducible Non-negative Matrix.

9.4.2 Periodicity.

9.4.3 Non-negative and Non-positive Off-Diagonal Elements.

9.4.4 Perron Matrix.

9.4.5 Decomposable Matrix.

9.5 Leslie Matrix.

9.6 Stochastic Matrices.

9.6.1 Basic Properties.

9.6.2 Finite Homogeneous Markov Chain.

9.6.3 Countably Infinite Stochastic Matrix.

9.6.4 Infinite Irreducible Stochastic Matrix.

9.7 Doubly Stochastic Matrices.

10. Positive Definite and Non-negative Definite Matrices.

10.1 Introduction.

10.2 Non-negative Definite Matrices.

10.2.1 Some General Properties.

10.2.2 Gram Matrix.

10.2.3 Doubly Non-negative Matrix.

10.3 Positive Definite Matrices.

10.4 Pairs of Matrices.

10.4.1 Non-Negative or Positive Definite Difference.

10.4.2 One or More Non-Negative Definite Matrices.

11. Special Products and Operators.

11.1 Kronecker Product.

11.1.1 Two Matrices.

11.1.2 More Than Two Matrices.

11.2 Vec Operator.

11.3 Vec-Permutation (Commutation) Matrix.

11.4 Generalized Vec-Permutation Matrix.

11.5 Vech Operator.

11.5.1 Symmetric Matrix.

11.5.2 Lower Triangular Matrix.

11.6 Star Operator.

11.7 Hadamard Product.

11.8 Rao-Khatri Product.

12. Inequalities.

12.1 Cauchy-Schwarz Inequalities.

12.1.1 Real Vector Inequalities and Extensions.

12.1.2 Complex Vector Inequalities.

12.1.3 Real Matrix Inequalities.

12.1.4 Complex Matrix Inequalities.

12.2 Hölder's Inequality and Extensions.

12.3 Minkowski's Inequality and Extensions.

12.4 Weighted Means.

12.5 Quasilinearization (Representation Theorems).

12.6 Some Geometrical Properties.

12.7 Miscellaneous Inequalities.

12.7.1 Determinants.

12.7.2 Trace.

12.7.3 Quadratics.

12.7.4 Sums and Products.

12.8 Some Identities.

13. Linear Equations.

13.1 Unknown Vector.

13.1.1 Consistency.

13.1.2 Solutions.

13.1.3 Homogeneous Equations.

13.1.4 Restricted Equations.

13.2 Unknown Matrix.

13.2.1 Consistency.

13.2.2 Some Special Cases.

14. Partitioned Matrices.

14.1 Schur Complement.

14.2 Inverses.

14.3 Determinants.

14.4 Positive and Non-Negative Definite Matrices.

14.5 Eigenvalues.

14.6 Generalized Inverses.

14.6.1 Weak Inverses.

14.6.2 Moore-Penrose Inverses.

14.7 Miscellaneous Partitions.

15. Patterned Matrices.

15.1 Inverses.

15.2 Determinants.

15.3 Perturbations.

15.4 Matrices With Repeated Elements and Blocks.

15.5 Generalized Inverses.

15.5.1 Weak Inverses.

15.5.2 Moore-Penrose Inverses.

16. Factorization of Matrices.

16.1 Similarity Reductions.

16.2 Reduction by Elementary Transformations.

16.2.1 Types of Transformation.

16.2.2 Equivalence Relation.

16.2.3 Echelon Form.

16.2.4 Hermite Form.

16.3 Singular Value Decomposition (SVD).

16.4 Triangular Factorizations.

16.5 Orthogonal-Triangular Reductions.

16.6 Further Diagonal or Tridiagonal Reductions.

16.7 Congruence.

16.8 Simultaneous Reductions.

16.9 Polar Decomposition.

16.10 Miscellaneous Factorizations.

17. Differentiation and Finite Differences.

17.1 Introduction.

17.2 Scalar Differentiation.

17.2.1 Differentiation with Respect to t.

17.2.2 Differentiation With Respect to a Vector Element.

17.2.3 Differentiation With Respect to a Matrix Element.

17.3 Vector Differentiation: Scalar Function.

17.3.1 Basic Results.

17.3.2 x = vec X.

17.3.3 Function of a Function.

17.4 Vector Differentiation: Vector Function.

17.5 Matrix Differentiation: Scalar Function.

17.5.1 General Results.

17.5.2 f = trace.

17.5.3 f = determinant.

17.5.4 f = y_rs.

17.5.5 f = eigenvalue.

17.6 Transformation Rules.

17.7 Matrix Differentiation: Matrix Function.

17.8 Matrix Differentials.

17.9 Perturbation Using Differentials.

17.10 Matrix Linear Differential Equations.

17.11 Second Order Derivatives.

17.12 Vector Difference Equations.

18. Jacobians.

18.1 Introduction.

18.2 Method of Differentials.

18.3 Further Techniques.

18.3.1 Chain Rule.

18.3.2 Exterior (Wedge) Product of Differentials.

18.3.3 Induced Functional Equations.

18.3.4 Jacobians Involving Transposes.

18.3.5 Patterned Matrices and L-Structures.

18.4 Vector Transformations.

18.5 Jacobians for Complex Vectors and Matrices.

18.6 Matrices with Functionally Independent Elements.

18.7 Symmetric and Hermitian Matrices.

18.8 Skew-Symmetric and Skew-Hermitian Matrices.

18.9 Triangular Matrices.

18.9.1 Linear Transformations.

18.9.2 Nonlinear Transformations of X.

18.9.3 Decompositions With One Matrix Skew Symmetric.

18.9.4 Symmetric Y.

18.9.5 Positive Definite Y.

18.9.6 Hermitian Positive Definite Y.

18.9.7 Skew Symmetric Y.

18.9.8 LU Decomposition.

18.10 Decompositions Involving Diagonal Matrices.

18.10.1 Square Matrices.

18.10.2 One Triangular Matrix.

18.10.3 Symmetric and Skew Symmetric Matrices.

18.11 Positive Definite Matrices.

18.12 Cayley Transformation.

18.13 Diagonalizable Matrices.

18.14 Pairs of Matrices.

19. Matrix Limits, Sequences and Series.

19.1 Limits.

19.2 Sequences.

19.3 Asymptotically Equivalent Sequences.

19.4 Series.

19.5 Matrix Functions.

19.6 Matrix Exponentials.

20. Random Vectors.

20.1 Notation.

20.2 Variances and Covariances.

20.3 Correlations.

20.3.1 Population Correlations.

20.3.2 Sample Correlations.

20.4 Quadratics.

20.5 Multivariate Normal Distribution.

20.5.1 Definition and Properties.

20.5.2 Quadratics in Normal Variables.

20.5.3 Quadratics and Chi-squared.

20.5.4 Independence and Quadratics.

20.5.5 Independence of Several Quadratics.

20.6 Complex Random Vectors.

20.7 Regression Models.

20.7.1 V is the Identity Matrix.

20.7.2 V is Positive Definite.

20.7.3 V is Non-negative Definite.

20.8 Other Multivariate Distributions.

20.8.1 Multivariate t-Distribution.

20.8.2 Elliptical and Spherical Distributions.

20.8.3 Dirichlet Distributions.

21. Random Matrices.

21.1 Introduction.

21.2 Generalized Quadratic Forms.

21.2.1 General Results.

21.2.2 Wishart Distribution.

21.3 Random Samples.

21.3.1 One Sample.

21.3.2 Two Samples.

21.4 Multivariate Linear Model.

21.4.1 Least Squares Estimation.

21.4.2 Statistical Inference.

21.4.3 Two Extensions.

21.5 Dimension Reduction Techniques.

21.5.1 Principal Component Analysis (PCA).

21.5.2 Discriminant Coordinates.

21.5.3 Canonical Correlations and Variates.

21.5.4 Latent Variable Methods.

21.5.5 Classical (Metric) Scaling.

21.6 Procrustes Analysis (Matching Configurations).

21.7 Some Specific Random Matrices.

21.8 Allocation Problems.

21.9 Matrix Variate Distributions.

21.10 Matrix Ensembles.

22. Inequalities for Probabilities and Random Variables.

22.1 General Probabilities.

22.2 Bonferroni-Type Inequalities.

22.3 Distribution-Free Probability Inequalities.

22.3.1 Chebyshev-Type Inequalities.

22.3.2 Kolmogorov-Type Inequalities.

22.3.3 Quadratics and Inequalities.

22.4 Data Inequalities.

22.5 Inequalities for Expectations.

22.6 Multivariate Inequalities.

22.6.1 Convex Subsets.

22.6.2 Multivariate Normal.

22.6.3 Inequalities for Other Distributions.

23. Majorization.

23.1 General Properties.

23.2 Schur Convexity.

23.3 Probabilities and Random Variables.

24. Optimization and Matrix Approximation.

24.1 Stationary Values.

24.2 Using Convex and Concave Functions.

24.3 Two General Methods.

24.3.1 Maximum Likelihood.

24.3.2 Least Squares.

24.4 Optimizing a Function of a Matrix.

24.4.1 Trace.

24.4.2 Norm.

24.4.3 Quadratics.

24.5 Optimal Designs.

References.

Index.