In its second edition, Generalized, Linear, and Mixed Models still stands as an authoritative text for a broad readership. Given its open and clear narrative presentation, the book welcomes upper-level undergraduates as well as graduate students, researchers, and readers in industry who need such models and methods. The authors give a detailed presentation of linear models (LMs), generalized linear models (GLMs), linear mixed models (LMMs), generalized linear mixed models (GLMMs), models for longitudinal data, and nonlinear models.
All chapters follow the same structure: a short introduction, a theoretical exposition, detailed worked examples with accompanying illustrations, inference, and finally the exercises. The introduction is intuitive and gives a solid sense of what is to come. The theoretical exposition is presented gently so as not to scare anyone away, providing the essential mathematical statements without formal theorems and proofs.
One could argue that the book is well suited to those seeking to learn what they need for serious applied work. The narrative presentation is so thorough that the mathematics becomes easy to grasp. There are, on average, ten exercises per chapter, most of which ask for proofs; presumably this is where the authors make up for the absence of proofs in the text, so that proof-loving readers do not feel neglected.
As prerequisites for a successful read, one should have some undergraduate knowledge of linear models and some linear algebra or matrix theory. Familiarity with a statistical language and computing software would be a plus, as many of the examples call for data analysis. I would recommend this book to anyone interested in such models, especially practitioners applying these methods to real-life problems. A book like this one only broadens one's horizons.
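To give a flavor of the data-analysis side mentioned above, here is a minimal Python sketch (my own illustration, not taken from the book; all data are simulated with assumed parameter values) of the ANOVA, i.e. method-of-moments, estimation of variance components in a balanced one-way random-effects model, the kind of material covered in the chapters on one-way classifications and linear mixed models:

```python
# Balanced one-way random-effects model (hypothetical example):
#   y_ij = mu + a_i + e_ij,  a_i ~ N(0, s2_a),  e_ij ~ N(0, s2_e).
import random

random.seed(42)
m, n = 200, 10                 # m groups, n observations per group (assumed)
mu, s2_a, s2_e = 5.0, 4.0, 1.0  # assumed true parameter values

data = []
for _ in range(m):
    a = random.gauss(0.0, s2_a ** 0.5)            # group-level random effect
    data.append([mu + a + random.gauss(0.0, s2_e ** 0.5) for _ in range(n)])

grand = sum(sum(g) for g in data) / (m * n)       # grand mean
means = [sum(g) / n for g in data]                # group means

# Between-group and within-group mean squares.
msa = n * sum((gm - grand) ** 2 for gm in means) / (m - 1)
mse = sum((y - gm) ** 2 for g, gm in zip(data, means) for y in g) / (m * (n - 1))

# ANOVA (method-of-moments) estimators of the variance components.
s2_e_hat = mse
s2_a_hat = (msa - mse) / n
print(f"sigma_e^2 estimate: {s2_e_hat:.2f}, sigma_a^2 estimate: {s2_a_hat:.2f}")
```

With a large number of groups, the estimates land close to the simulated values; the same quantities fall out of the expected-mean-squares identities E(MSE) = sigma_e^2 and E(MSA) = sigma_e^2 + n * sigma_a^2.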
Preface to the First Edition.
1.2 Factors, Levels, Cells, Effects And Data.
1.3 Fixed Effects Models.
1.4 Random Effects Models.
1.5 Linear Mixed Models (LMMs).
1.6 Fixed Or Random?
1.8 Computer Software.
2. One-Way Classifications.
2.1 Normality And Fixed Effects.
2.2 Normality, Random Effects And MLE.
2.3 Normality, Random Effects And REML.
2.4 More On Random Effects And Normality.
2.5 Binary Data: Fixed Effects.
2.6 Binary Data: Random Effects.
3. Single-Predictor Regression.
3.2 Normality: Simple Linear Regression.
3.3 Normality: A Nonlinear Model.
3.4 Transforming Versus Linking.
3.5 Random Intercepts: Balanced Data.
3.6 Random Intercepts: Unbalanced Data.
3.7 Bernoulli-Logistic Regression.
3.8 Bernoulli-Logistic With Random Intercepts.
4. Linear Models (LMs).
4.1 A General Model.
4.2 A Linear Model For Fixed Effects.
4.3 MLE Under Normality.
4.4 Sufficient Statistics.
4.5 Many Apparent Estimators.
4.6 Estimable Functions.
4.7 A Numerical Example.
4.8 Estimating Residual Variance.
4.9 Comments On The 1- And 2-Way Classifications.
4.10 Testing Linear Hypotheses.
4.11 T-Tests And Confidence Intervals.
4.12 Unique Estimation Using Restrictions.
5. Generalized Linear Models (GLMs).
5.2 Structure Of The Model.
5.3 Transforming Versus Linking.
5.4 Estimation By Maximum Likelihood.
5.5 Tests Of Hypotheses.
5.6 Maximum Quasi-Likelihood.
6. Linear Mixed Models (LMMs).
6.1 A General Model.
6.2 Attributing Structure To Var(y).
6.3 Estimating Fixed Effects For V Known.
6.4 Estimating Fixed Effects For V Unknown.
6.5 Predicting Random Effects For V Known.
6.6 Predicting Random Effects For V Unknown.
6.7 ANOVA Estimation Of Variance Components.
6.8 Maximum Likelihood (ML) Estimation.
6.9 Restricted Maximum Likelihood (REML).
6.10 Notes And Extensions.
6.11 Appendix For Chapter 6.
7. Generalized Linear Mixed Models.
7.2 Structure Of The Model.
7.3 Consequences Of Having Random Effects.
7.4 Estimation By Maximum Likelihood.
7.5 Other Methods Of Estimation.
7.6 Tests Of Hypotheses.
7.7 Illustration: Chestnut Leaf Blight.
8. Models For Longitudinal Data.
8.2 A Model For Balanced Data.
8.3 A Mixed Model Approach.
8.4 Random Intercept And Slope Models.
8.5 Predicting Random Effects.
8.6 Estimating Parameters.
8.7 Unbalanced Data.
8.8 Models For Non-Normal Responses.
8.9 A Summary Of Results.
9. Marginal Models.
9.2 Examples Of Marginal Regression Models.
9.3 Generalized Estimating Equations.
9.4 Contrasting Marginal And Conditional Models.
10. Multivariate Models.
10.2 Multivariate Normal Outcomes.
10.3 Non-Normally Distributed Outcomes.
10.4 Correlated Random Effects.
10.5 Likelihood Based Analysis.
10.6 Example: Osteoarthritis Initiative.
10.7 Notes And Extensions.
11. Nonlinear Models.
11.2 Example: Corn Photosynthesis.
11.3 Pharmacokinetic Models.
11.4 Computations For Nonlinear Mixed Models.
12. Departures From Assumptions.
12.2 Misspecifications Of Conditional Model For Response.
12.3 Misspecifications Of Random Effects Distribution.
12.4 Methods To Diagnose And Correct For Misspecifications.
13.2 Best Prediction (BP).
13.3 Best Linear Prediction (BLP).
13.4 Linear Mixed Model Prediction (BLUP).
13.5 Required Assumptions.
13.6 Estimated Best Prediction.
13.7 Henderson’s Mixed Model Equations.
14.2 Computing ML Estimates For LMMs.
14.3 Computing ML Estimates For GLMMs.
14.4 Penalized Quasi-Likelihood And Laplace.
Appendix M: Some Matrix Results.
M.1 Vectors And Matrices Of Ones.
M.2 Kronecker (Or Direct) Products.
M.3 A Matrix Notation.
M.4 Generalized Inverses.
M.5 Differential Calculus.
Appendix S: Some Statistical Results.
S.2 Normal Distributions.
S.3 Exponential Families.
S.4 Maximum Likelihood.
S.5 Likelihood Ratio Tests.
S.6 MLE Under Normality.