Most of us would profit from browsing in a book like this one. Written by a physicist, this account of "essential and advanced mathematics" offers us a glimpse of how well we actually communicate mathematics in our teaching, and also of how some of the most serious users of mathematics think about the subject.

The author's attitude is, perhaps, indicated by the dedication. It says:

This book is dedicated to Mr. McGuire of the former Seaforth Technical College in Sydney, who, when presented with a class of unsuccessful high school students, rekindled our interest in mathematics by telling us which procedures were a "WOT" (waste of time) and which were worth knowing.

This leads to a very different "take" on mathematics than the one that we, perhaps, would like our students to learn. Mathematics, for this author, is essentially a tool to be used as necessary. Ideas that don't impinge on the actual use of mathematics, however beautiful, are of minor interest. This, of course, is a perfectly reasonable attitude, and one that teachers of mathematics should ponder.

Fischer-Cripps' book is divided into two sections, on "Essential Mathematics" and "Advanced Mathematics." The "essential" part mainly deals with calculus and pre-calculus, from trigonometry and analytic geometry to infinite series. It is rounded out by two short sections on probability and on matrices. The "advanced" section is mostly multivariable calculus and differential equations, with a little bit at the end on complex-valued functions and numerical methods. The treatment throughout is the same: a very sketchy account of the main ideas, heavy on formulas and light on words, enlivened by a few examples here and there, mostly from physics.

Three things seemed notable to me. First, this author is not very interested in "definitions" in the mathematical sense. He is more interested in having multiple descriptions of the same concept, together with (usually symbolic or computational) examples. But when no description is available, he'll settle for a formula and rules for how and when to use it. Second, there are many more derivations here than I would have expected (they are not, of course, called "proofs"). Most of the time, they are straightforward computations. If necessary, mathematical niceties are ignored. Third, the author gives far less prominence to computers and computer algebra systems than I would have done. There is a lot of discussion here of how to do things by hand (for example, curve sketching) that most students today learn to do on a computer.

Every so often, there is something that is just weird. When discussing complex numbers, the author says that "for mathematical reasons, the symbol *i* can be seen to be equal to the square root of -1." In the section on limits, he says that "the concept of a limit is partly a philosophical one and an understanding of the nature of limits is often something that takes time and experience to achieve." Despite this, the next page gives the formal definition of limit, both symbolically and expressed in words. I have no idea what use readers will find for this definition. In several places, I couldn't see the point of including as much detail as the author does; at other places, I felt he was far too sketchy. It's unclear to me whether the author had a system in mind, or whether he just made these decisions by instinct.
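For readers wondering what "formal definition" is meant, it is presumably the standard ε–δ formulation (I quote the standard version, not the book's wording):

```latex
% Standard epsilon-delta definition of the limit; assumed to be
% (essentially) the definition the book states.
\lim_{x \to a} f(x) = L
\iff
\forall \varepsilon > 0 \; \exists \delta > 0 :
\quad 0 < |x - a| < \delta \implies |f(x) - L| < \varepsilon
```

It is hard to imagine a reader who finds limits "partly philosophical" getting much out of a quantifier string like this one.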

One finds throughout the book the same reification of notation that I see in my students. The author thinks that since dy/dx = f'(x) it is "evident" that dy = f'(x)dx. At one point he says that something is a differential equation because it is an equation that contains differentials. Both of these statements have a noble history in mathematics, but they are certainly not what we try to teach!
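For the record, the standard modern reading of the second claim goes through the linear approximation rather than through "cancelling" the *d*'s (this is the usual textbook account, not the author's):

```latex
% Sketch: dy is *defined* as the linear part of the increment,
% not obtained by treating dy/dx as a literal fraction.
\Delta y = f(x + \Delta x) - f(x) = f'(x)\,\Delta x + o(\Delta x),
\qquad
dy := f'(x)\,dx
```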

Several topics are included that I don't think are mathematics at all, such as the Schrödinger equation and the electric field. The theorems of vector analysis are treated as a mix of physics and mathematics (which may not, after all, be unreasonable). And some of the summaries, such as those on probability or Fourier transforms, strike me as useless unless the reader already knows the subject.

We can learn something from all this. We can learn some good things, such as the importance of descriptions rather than definitions. We can also be made aware of some problems, such as the fact that many of the things we spend so much time on can be viewed by our students as a WOT. Most of all, however, we can learn more about how our students think.

Should we tell our students to buy this book? I'm of two minds on that. On the one hand, I have a puritanical streak in me that says they should be learning the ideas and the concepts, and that this kind of summary will militate against that. On the other hand, I have a sneaking suspicion that they would actually like it and find it useful.

Fernando Q. Gouvêa is professor of mathematics at Colby College in Waterville, ME, where he has taught series and multivariable calculus more times than he'd like to count.