Aficionados of linear algebra are of course familiar with Roger Horn, one of the coauthors of the book now under review; among many other accomplishments, he is also a coauthor of two very respected books on linear algebra, *Matrix Analysis* and *Topics in Matrix Analysis*. The July 2007 issue of the journal *Linear Algebra and its Applications* was dedicated to him. Likewise, Stephan Garcia is also well-known in the mathematical community, both for his mathematical accomplishments and expository skill. So, given the pedigrees of the authors of this book, I approached it with high expectations — perhaps, I feared, unreasonably high ones. In fact, I needn’t have worried. However high my expectations were, this book not only met, but exceeded, them. It’s *that* good.

Unlike, say, a course in basic real analysis, there doesn’t seem to be much of a consensus on what should be taught in a course with a title like “Advanced Linear Algebra” or “Linear Algebra II”. A number of questions must be considered, including but not limited to: Do you emphasize linear transformations or matrices? Do you work over arbitrary fields? Do you discuss infinite dimensional spaces, or limit yourself to finite dimensions? Do you discuss significant applications? How many topics, if any, from numerical linear algebra do you cover? There are no objectively “right” answers to these questions; it’s all a matter of personal taste.

Precisely because of this, there are a number of books on the subject, all with different personalities. (See, e.g., the review of Shapiro’s *Linear Algebra and Matrices: Topics for a Second Course*, and the half-dozen other books cited there.) To Garcia and Horn, a second course in linear algebra should be strongly matrix-oriented (people who are looking for a baby course in functional analysis should definitely look elsewhere) and broadly accessible, with minimal prerequisites (the only real post-calculus prerequisites here are a first course in linear algebra and some experience with proofs). As we will shortly see, this book starts from near scratch, avoids sophisticated algebraic discussions and ideas (vector spaces are real or complex, not over arbitrary fields; although the standard matrix groups appear in the text, group-theoretic terminology is deliberately avoided), and yet nonetheless covers a lot of interesting linear algebra, some of which (based on my experience, anyway) may even be new to the instructor.

In more detail: Chapter 0 is introductory and covers not only the very basic stuff like functions and relations, but also reviews matrices and determinants (without proofs). It’s probably a good idea for even reasonably well-prepared students to glance at this chapter, because it also includes a mention of the Fundamental Theorem of Algebra and a discussion of Lagrange Interpolation, which may be new to some students. Chapters 1 and 2 review (this time with proofs) material on vector spaces and linear transformations. The authors, as noted above, work with vector spaces only over the real and complex numbers (mostly complex), and also work mostly with finite-dimensional spaces: although infinite-dimensional examples are given and occasionally used, the theorems that are proved about vector spaces are generally limited to finite-dimensional ones, thereby avoiding the need for Zornification. In fact, the very concepts of linear independence and dependence are defined only for finite sets, so that a basis, by definition, is always finite.

Chapters 4, 5 and 7 discuss inner product spaces and orthogonality. Many first courses in linear algebra look at inner product spaces, though not generally, I think, to the extent that they are discussed here. In particular, connections with Fourier series are discussed, and the Riesz representation theorem for linear functionals on finite-dimensional inner product spaces is proved. There is also an extensive discussion of orthogonal complements and projections, and least-squares approximation.
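The geometric heart of this material (not an example from the book, just my own illustration) is that the least-squares residual is orthogonal to the column space; a minimal numerical sketch, using a randomly generated overdetermined system:

```python
import numpy as np

# Hypothetical illustration (not from the book): solve an overdetermined
# system in the least-squares sense via the normal equations, and check
# that the residual b - A x is orthogonal to every column of A.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 3))   # tall matrix: 6 equations, 3 unknowns
b = rng.standard_normal(6)

# Normal equations: A^T A x = A^T b.
x = np.linalg.solve(A.T @ A, A.T @ b)
residual = b - A @ x

# Orthogonality of the residual to the column space of A.
print(np.allclose(A.T @ residual, 0))   # True
```

The orthogonality check is exactly the statement that \(Ax\) is the orthogonal projection of \(b\) onto the column space of \(A\).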

Sandwiched among these chapters is chapter 3, which talks about partitioned matrices; this material may be new to most readers, but its early introduction here is valuable because the authors use it consistently throughout the rest of the text.

The rest of the text discusses topics in matrix theory that will likely not have been covered in a student’s first course. Not surprisingly, eigenvalues and eigenvectors and their surrounding theory play a prominent role, with discussions of (among other things) triangularization, diagonalization, the Jordan canonical form, singular value decomposition and the pseudoinverse of a matrix, and eigenvalue interlacing for Hermitian matrices. Some attention is paid to location and estimation of eigenvalues (Gershgorin discs are discussed, for example, as is Rayleigh-Ritz) but this is not intended as a numerical linear algebra text; a number of topics that one might find in such a book are not present here, including iterative methods.
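For readers who have not met Gershgorin discs before: every eigenvalue of a square matrix lies within distance \(R_i\) of some diagonal entry \(a_{ii}\), where \(R_i\) is the sum of the absolute values of the off-diagonal entries in row \(i\). A small numerical check (my own sketch, not taken from the book):

```python
import numpy as np

# Hypothetical illustration (not from the book) of Gershgorin's theorem:
# each eigenvalue of A lies in some disc centered at a diagonal entry
# a_ii with radius equal to the sum of |a_ij| over j != i.
A = np.array([[4.0, 1.0, 0.5],
              [0.2, -3.0, 0.3],
              [0.1, 0.4, 1.0]])

centers = np.diag(A)
radii = np.sum(np.abs(A), axis=1) - np.abs(centers)

for lam in np.linalg.eigvals(A):
    in_some_disc = any(abs(lam - c) <= r for c, r in zip(centers, radii))
    print(in_some_disc)   # True for every eigenvalue
```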

Other topics covered in the book include unitary, normal and self-adjoint matrices; the polar decomposition; condition numbers; and the norm of a matrix.

Of course, books must have manageable length, and so authors must make difficult choices about what material to *not* put in. In the present case, there are some omissions that I might quibble with, most notably the Perron-Frobenius theory of nonnegative matrices. I was also surprised that, although the term “linear functional” is defined, the notion of the dual space of a vector space is not. I also did not see the definition of a quotient space of a vector space.

Another quibble: the list of references is short, containing only ten items, and with the exception of *Matrix Analysis* and *Topics in Matrix Analysis*, no other textbooks are listed on linear algebra at the level of this book, or slightly beyond. I realize that Macy’s doesn’t advertise for Gimbels, but I think students in a course at this level could profit from glancing at other books, or, even better, accessible articles in journals like the *College Mathematics Journal* or the *Mathematical Gazette*. None of these are listed, either.

These omissions notwithstanding, this text is a treasure trove of interesting facts about matrices, some of which are “folklore” results whose proofs are not easily found, and some of which may not be well-known even to faculty members. For example, the authors state and prove an interesting theorem (they refer to it as Shoda’s theorem) that all square matrices with trace \(0\) are commutators. This result may be surprising to people with a background in Lie algebras or group theory: the Lie algebra of trace-\(0\) \(n\times n\) matrices is the derived algebra of the Lie algebra of *all* \(n\times n\) matrices under the bracket commutator operation; in general, however, the derived algebra of an arbitrary Lie algebra does *not* consist solely of commutators \([x,y]\) but is instead only generated by these elements. Likewise, by analogy to group theory, the commutator subgroup of an arbitrary group does not generally consist only of commutators but is the subgroup generated by them. So it is not necessarily to be expected that any \(n\times n\) matrix with trace \(0\) is automatically a commutator, but in fact that turns out to be the case. I happened to once stumble across this theorem for matrices in a set of lecture notes by William Kahan but, to my knowledge, this theorem does not appear in any standard undergraduate textbook.
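The easy direction of Shoda’s theorem (every commutator has trace \(0\), since \(\operatorname{tr}(AB) = \operatorname{tr}(BA)\)) is simple to confirm numerically; a minimal sketch of my own:

```python
import numpy as np

# Hypothetical numerical check (not from the book) of the easy direction
# of Shoda's theorem: every commutator AB - BA has trace 0, because
# tr(AB) = tr(BA).
rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

commutator = A @ B - B @ A
print(np.isclose(np.trace(commutator), 0.0))   # True
```

The hard direction, of course, is the converse: actually exhibiting a commutator decomposition of a given trace-\(0\) matrix, which is what the theorem proved in the text accomplishes.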

In addition, there are results that were new to me. Although I had heard of the QR factorization, for example, I did not know there was also a QS factorization (for unitary matrices). The section on Brauer’s theorem and the Google matrix also contained material that was new to me.

There are a lot of things about this book that make it appealing as a text. For one thing, the authors’ writing style is consistently clear and inviting, and there are many exercises covering a broad range of difficulty. (Solutions are not provided, which I view as another pedagogical plus; a solutions manual is, however, available from the publisher to instructors who adopt the text.) Many books make a point of announcing that prerequisites are minimal, and then go on to cheerfully ignore this; the authors here, however, seem to have made a serious effort to live within the limitations that they have imposed on themselves.

Perhaps a concrete example of this phenomenon should be noted. Like many people, I first saw the norm of a matrix \(A\) defined as the maximum value of the norms \(\|Ax\|\) as \(x\) ranges over all vectors with norm \(1\). Of course, knowing that this maximum exists requires some analysis, which is not a prerequisite for this text; the authors avoid this problem by defining the spectral norm of the matrix \(A\) to be the largest singular value of \(A\), and then proving the standard properties of this norm from that definition. This is an approach that I don’t believe I’ve seen before.
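The equivalence of the two definitions is easy to probe numerically; the following sketch (mine, not the book’s) checks that \(\|Ax\| \le \sigma_1\) for random unit vectors \(x\), and that the bound is attained at the top right singular vector:

```python
import numpy as np

# Hypothetical check (not from the book): the largest singular value of
# A equals the maximum of ||Ax|| over unit vectors x, attained at the
# right singular vector belonging to the largest singular value.
rng = np.random.default_rng(2)
A = rng.standard_normal((5, 3))

U, s, Vt = np.linalg.svd(A)
spectral_norm = s[0]                 # singular values come sorted descending

# ||Ax|| never exceeds the spectral norm for random unit vectors x ...
xs = rng.standard_normal((1000, 3))
xs /= np.linalg.norm(xs, axis=1, keepdims=True)
print(all(np.linalg.norm(A @ x) <= spectral_norm + 1e-12 for x in xs))  # True

# ... and the top right singular vector attains the bound.
v = Vt[0]
print(np.isclose(np.linalg.norm(A @ v), spectral_norm))   # True
```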

Another appealing aspect of this text is that almost every chapter ends with a section titled “Notes”. These sections are generally short, rarely more than a page long and often just a paragraph, but they provide useful tidbits of information that did not make it into the main text. For example, it was in one of these sections that I learned that there are more than 80 known characterizations of normal matrices.

The large amount of material covered here makes this book valuable as a desk reference as well as a text. Specialists in linear algebra may need the kind of detail and coverage of *Matrix Analysis* or *Topics in Matrix Analysis*, but for non-specialists this book may well be sufficient as a source of information for more basic questions that come up involving matrices.

To summarize and conclude: if you are shopping around for a text for a second course in linear algebra, and your idea of a syllabus aligns with those of the authors (matrix-oriented, avoidance of reliance on abstract algebra, not much in the way of applications), then this text should definitely be on your short list. Even if you’re not teaching such a course, this book is worth a look as a general reference for matrix theory. It’s a very valuable addition to the literature, and is highly recommended.

Mark Hunacek ([email protected]) teaches mathematics at Iowa State University.