Linear Algebra with Applications

Otto Bretscher
Publisher: Pearson
Publication Date: 2013
Number of Pages: 508
Format: Hardcover
Edition: 5
Price: $166.00
ISBN: 9780321796974
Category: Textbook

BLL Rating: The Basic Library List Committee suggests that undergraduate mathematics libraries consider this book for acquisition.

[Reviewed by Gary Stoudt, on 07/16/2013]

This book contains the standard material usually found in an introductory linear algebra course at U.S. colleges and universities. It has some novel features, but it does not overdo them. This is a textbook where an instructor might actually be able to cover all of the material.

The author does not start out with an overview of vectors and vector algebra (this is relegated to an appendix) and I did not miss it. Dot and cross products are introduced as needed. The book begins with linear systems and their solutions. Chapter 1 also includes the first taste of historical comments that are liberally placed throughout the text. Many sections of the text also include exercises either with an historical bent or from actual original sources. This is one of the novel and very welcome features of this book.

The author treats the matrix-vector product \(A\mathbf{x}\) as a linear combination of the columns of \(A\), in keeping with current pedagogical practice. I did miss Gaussian elimination presented as LU factorization and back substitution, but that is a personal preference, I suppose.
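
To make the column viewpoint concrete (an illustration of mine, not an example taken from the text): if \(A\) has columns \(\mathbf{a}_1, \dots, \mathbf{a}_n\), then
\[ A\mathbf{x} = x_1\mathbf{a}_1 + x_2\mathbf{a}_2 + \cdots + x_n\mathbf{a}_n, \]
so that, for instance,
\[ \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} \begin{pmatrix} 5 \\ 6 \end{pmatrix} = 5\begin{pmatrix} 1 \\ 3 \end{pmatrix} + 6\begin{pmatrix} 2 \\ 4 \end{pmatrix} = \begin{pmatrix} 17 \\ 39 \end{pmatrix}. \]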

Linear transformations are introduced early in Chapter 2, but strictly as matrix operations on the Euclidean spaces \(\mathbb{R}^n\). By using transformations the author can take an early look at geometry (rotations, reflections) and projections (although only on \(\mathbb{R}^2\) and \(\mathbb{R}^3\) for now). This is a nice compromise between the “linear algebra as the study of linear transformations” adherents and the “linear algebra as the study of matrices” followers. We all know that in finite dimensions these are one and the same but the choice of perspective can very much change the nature of an introductory linear algebra course.
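
To recall the sort of geometry available at this stage (standard formulas, not quoted from the book): counterclockwise rotation of \(\mathbb{R}^2\) through an angle \(\theta\) is given by the matrix
\[ \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}, \]
while orthogonal projection onto the line spanned by a unit vector \(\mathbf{u}\) is given by \(\mathbf{u}\mathbf{u}^T\).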

The author saves the general notion of a vector space (which he calls a "linear space" to avoid confusion with the ways in which "vector" is used) until Chapter 4. A matrix-only course could skip this chapter with no loss of continuity.

Continuing this compromise approach, the kernel and image are introduced for matrices in Chapter 3, and the general notions for linear transformations are likewise saved for Chapter 4. The properties of the kernel and image serve as prototypes for the definition of subspace that comes later. Linear independence and basis are nicely done through the notion of "redundant" vectors. In addition, orthogonality is introduced in \(\mathbb{R}^n\) first and then in general inner product spaces, allowing Fourier series to make an appearance. The author links orthogonality to least squares and curve fitting and also links Gram-Schmidt to QR factorization, again in keeping with more modern textbook approaches. While the orthogonality of the fundamental subspaces of a matrix and its transpose is mentioned, the idea does not receive the prominence it deserves; it is just as important as the dimension theorem.
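
To spell out the least squares connection (standard facts, stated in my own notation rather than the author's): a least squares solution \(\hat{\mathbf{x}}\) of an inconsistent system \(A\mathbf{x} = \mathbf{b}\) satisfies the normal equations
\[ A^T\!A\,\hat{\mathbf{x}} = A^T\mathbf{b}, \]
precisely because the residual \(\mathbf{b} - A\hat{\mathbf{x}}\) must lie in \((\operatorname{im} A)^{\perp} = \ker(A^T)\); this is exactly the orthogonality of the fundamental subspaces at work.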

The author handles the tricky topic of determinants with a nice approach using "patterns" instead of the (at this level) difficult notion of permutations. This pattern approach works well for sparse matrices, as the author shows, but for less sparse matrices he resorts to elimination. It is interesting to note that minors and cofactor expansion are relegated to an optional section, given their association with Laplace and this author's nice use of history throughout the text.
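
For readers unfamiliar with the term, a "pattern" in an \(n \times n\) matrix is a choice of \(n\) entries, one from each row and one from each column, so the approach is the permutation expansion in lightly disguised form (the standard formula, in my notation):
\[ \det A = \sum_{\sigma} \operatorname{sgn}(\sigma)\, a_{1\sigma(1)} a_{2\sigma(2)} \cdots a_{n\sigma(n)}, \]
which for a \(2 \times 2\) matrix reduces to the familiar \(ad - bc\).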

As is typical in an introductory text, determinants are used only as a test for invertibility and for finding eigenvalues; this book does not take a determinant-free approach to eigenvalues. The main use of eigenvalues and eigenvectors in the book is in differential equations and dynamical systems, which serve both as examples and as motivation, and an additional chapter is devoted to further study of these applications.
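
The link to dynamical systems is the standard one (sketched here in my notation, not quoted from the text): if \(A\) is diagonalizable with eigenvalues \(\lambda_1, \dots, \lambda_n\) and corresponding eigenvectors \(\mathbf{v}_1, \dots, \mathbf{v}_n\), then the discrete system \(\mathbf{x}_{k+1} = A\mathbf{x}_k\) with \(\mathbf{x}_0 = c_1\mathbf{v}_1 + \cdots + c_n\mathbf{v}_n\) has the closed form
\[ \mathbf{x}_k = c_1\lambda_1^k\mathbf{v}_1 + \cdots + c_n\lambda_n^k\mathbf{v}_n, \]
so the eigenvalues determine growth, decay, and stability of trajectories.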

Finally, the author covers symmetric matrices and quadratic forms. There are nice applications to conic sections, but many other applications of symmetric matrices are left out. I was pleased, however, to see singular values and the singular value decomposition included, although again applications were wanting.
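
For context (standard statements, not the author's wording): the spectral theorem factors a symmetric matrix as \(A = QDQ^T\) with \(Q\) orthogonal and \(D\) diagonal, which is what brings a quadratic form \(q(\mathbf{x}) = \mathbf{x}^T A\mathbf{x}\) into principal axes and classifies the associated conic; the singular value decomposition \(A = U\Sigma V^T\) extends this kind of factorization to arbitrary, even non-square, matrices.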

Each section contains a very nice list of exercises, and these exercises range from basic to challenging. There are typically some easy (but realistic) applications, particularly early on in the book. These early application exercises can be used to help motivate the study of linear algebra beyond solving systems of linear equations.

The author combines the "definition/theorem/proof" style with a more conversational one. Some theorems summarize what was found in earlier examples; others are presented in the theorem/proof style. As is the nature of the subject, however, many proofs are merely computational. While there are definitions, theorems, and proofs, they do not appear from nowhere: each definition or theorem follows a few motivating examples. Students should have little difficulty reading this text. The author also includes historical commentary and problems, and these are not an afterthought; history appears at various places throughout the book, and I found the author's historical comments a nice jumping-off point for further study. It is by no means an historical text, but the historical material is a very nice addition to a solid introduction to linear algebra.

In summary, this book covers the typical introduction to linear algebra course. It does not suffer from “textbook bloat,” but then again a few of an instructor’s pet topics might be omitted. There is no reliance on technology, so an instructor will need to supply his/her own technology-related material.


Gary Stoudt (gsstoudt@iup.edu) has been in the Mathematics Department at Indiana University of Pennsylvania since 1991.

1. Linear Equations

1.1 Introduction to Linear Systems

1.2 Matrices, Vectors, and Gauss-Jordan Elimination

1.3 On the Solutions of Linear Systems; Matrix Algebra

2. Linear Transformations

2.1 Introduction to Linear Transformations and Their Inverses

2.2 Linear Transformations in Geometry

2.3 Matrix Products

2.4 The Inverse of a Linear Transformation

3. Subspaces of \(\mathbb{R}^n\) and Their Dimensions

3.1 Image and Kernel of a Linear Transformation

3.2 Subspaces of \(\mathbb{R}^n\); Bases and Linear Independence

3.3 The Dimension of a Subspace of \(\mathbb{R}^n\)

3.4 Coordinates

4. Linear Spaces

4.1 Introduction to Linear Spaces

4.2 Linear Transformations and Isomorphisms

4.3 The Matrix of a Linear Transformation

5. Orthogonality and Least Squares

5.1 Orthogonal Projections and Orthonormal Bases

5.2 Gram-Schmidt Process and QR Factorization

5.3 Orthogonal Transformations and Orthogonal Matrices

5.4 Least Squares and Data Fitting

5.5 Inner Product Spaces

6. Determinants

6.1 Introduction to Determinants

6.2 Properties of the Determinant

6.3 Geometrical Interpretations of the Determinant; Cramer's Rule

7. Eigenvalues and Eigenvectors

7.1 Diagonalization

7.2 Finding the Eigenvalues of a Matrix

7.3 Finding the Eigenvectors of a Matrix

7.4 More on Dynamical Systems

7.5 Complex Eigenvalues

7.6 Stability

8. Symmetric Matrices and Quadratic Forms

8.1 Symmetric Matrices

8.2 Quadratic Forms

8.3 Singular Values

9. Linear Differential Equations

9.1 An Introduction to Continuous Dynamical Systems

9.2 The Complex Case: Euler's Formula

9.3 Linear Differential Operators and Linear Differential Equations

Appendix A. Vectors

Appendix B. Techniques of Proof

Answers to Odd-numbered Exercises

Subject Index

Name Index