It’s not uncommon for a long book to have a short title. For example, a thousand-page tome may be titled simply Calculus or Algebra. Richard Farebrother has written a book with the opposite ratio of title words to content pages. His book L1-Norm and L∞-Norm Estimation: An Introduction to the Least Absolute Residuals, the Minimax Absolute Residual and Related Fitting Procedures is only 58 pages long, and five of those pages are blanks separating chapters. The book belongs to the series of small books called SpringerBriefs in Statistics.
With any small book, one has to ask whether the book is small because it is incomplete or because it is focused. This book falls in the latter category. Its verbose title explains exactly what the book contains. It surveys one focused topic. It does not contain many details but gives an overview and plenty of references. However, one must wonder whether the book should have been published as a survey article.
Although L2 (least squares) estimation has dominated modern statistical practice, L1 and L∞ estimation are older. The older methods quickly fell out of use because L2 methods have attractive theoretical and, above all, computational properties. However, L1 and L∞ estimation are being revisited because their theory has since been developed and the requisite computations are now tractable, thanks not only to advances in computing hardware but also to advances in optimization algorithms.
One advantage of L1 methods is that they are very robust to outliers. L1 methods are also related to compressed sensing, an exciting new area of research. It is less clear what makes L∞ methods attractive. Indeed, such methods can be very non-robust. The Least Median of Squares (LMS) method is a variant of L∞ estimation that is more robust. The author says that LMS has excellent statistical properties and is important in robust statistics. This claim is not supported in the book at hand, though presumably it is in the references.
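The robustness contrast is easy to see in the simplest possible setting, estimating the center of a sample: the L2, L1, and L∞ criteria are minimized by the mean, the median, and the midrange respectively. The following short Python sketch (an illustration of the general point, not an example from the book) shows a single outlier barely moving the L1 estimate, shifting the L2 estimate substantially, and dragging the L∞ estimate the farthest.

```python
# Estimating the center c of a sample under three criteria:
#   L2: minimize sum (x - c)^2      -> the mean
#   L1: minimize sum |x - c|        -> the median
#   Linf: minimize max |x - c|      -> the midrange (min + max) / 2
from statistics import mean, median

def midrange(xs):
    # Minimizer of the maximum absolute deviation:
    # halfway between the two extreme observations.
    return (min(xs) + max(xs)) / 2

clean = [9.8, 10.1, 9.9, 10.2, 10.0]
with_outlier = clean + [100.0]  # one wild observation

for name, est in [("L2 (mean)", mean),
                  ("L1 (median)", median),
                  ("Linf (midrange)", midrange)]:
    print(f"{name:16s} clean: {est(clean):6.2f}   "
          f"with outlier: {est(with_outlier):6.2f}")
# The median moves from 10.00 to 10.05, the mean from 10.00 to 25.00,
# and the midrange from 10.00 to 54.90.
```

Since the L∞ estimate depends only on the two most extreme observations, one bad data point controls it completely, which is exactly why a robust variant such as LMS replaces the maximum with a median.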
John D. Cook is an independent consultant and blogs at The Endeavour.