In this book, Michel Blay offers us a close reading of several important seventeenth- and eighteenth-century mathematical and physical texts in order to explain what the author views as a fundamental change in our view of the relation between mathematics and reality. Blay argues that early in the century, especially in the work of Galileo and Huygens, the universe was viewed as being itself mathematically organized, so that their mathematical explanations made real claims about what was going on. At the end of the century, particularly in the work of Lagrange, one sees a different project, which Blay calls "mathematization": "an approach whose object is to reconstruct the phenomena of nature within the domain of mathematical intelligibility" (p. 3), but one which no longer makes any claims about the objective reality of this mathematical model: scientific hypotheses are now truly mathematical thoughts, but the project has "put aside any meaningful ontological aims it might have had" (p. 4).
Blay connects this change with the appearance of the calculus. More specifically, he locates the change in the clash between the enormous power of the calculus as a tool for understanding nature and the deep problems involved in justifying its use of infinite and infinitesimal quantities. Faced with their inability to make sense of these infinities in any real sense, the philosophers retreated into the formalism of mathematics (in particular by distinguishing "mathematical infinity" from "metaphysical infinity") and gave up their claim to deal directly with reality.
For Blay, the crucial turning point in this story is the work of Fontenelle, who attempted in his 1727 Éléments de la géométrie de l'infini to construct a rational theory of infinite quantities. Fontenelle's attempt was largely a failure (it could hardly be otherwise at the time), and this failure led thinkers to a fundamental change in their thinking:
This failure, which to a large degree put the future of the new science at risk, operated on two levels: on the one hand, it affected the mathematics on which the new science was based, since the theoretical difficulties concerning the foundations of the calculus of the infinite (...) were insurmountable at the beginning of the eighteenth century; and, on the other hand, it affected the larger interpretation of Fontenelle's work, which was not properly appreciated by his contemporaries, who concerned themselves only with the purely mathematical problems it raised.
The result was that geometrization once and for all gave way to mathematization. The success of the differential and integral calculus rapidly caused the old philosophical and theological issues to be forgotten, replacing them with wonder at the fecundity of the new technique (...)
Thus Lagrange's work, which enjoyed an impressive success at the horizon of knowledge in the latter part of the eighteenth century, was to be accompanied by a positivist neglect of the question of what meaning could be assigned to mathematization and to what uses mathematical physics might be put. (p. 11)
Even this short summary makes it clear that this is a very sophisticated argument, and one that offers much to argue with. Blay's central thesis, that there is a fundamental difference between how, say, Huygens and Lagrange conceived the relation between their mathematics and the universe, may very well be correct (though I'm not quite convinced the contrast is as sharp as he would like it to be). The change in point of view, however, happened in the context of many other cultural and philosophical changes, and the specific problem to which Blay points may have been only one of many factors that contributed to it.
However that may be, this is a historical and philosophical discussion that will only be of interest to a small minority of mathematicians and teachers of mathematics, and it seems reasonable to ask whether the book has anything to offer the rest of us. I think the answer is yes, for two reasons.
First, the book is a useful reminder of the difficulties involved in using mathematics (especially the calculus) to describe nature. For example, consider the difficulties that the main players in Blay's story had when attempting to understand uniformly accelerated motion and the transition from rest to motion. Galileo argued at length that a "rising heavy body does not persist for any finite time in any one degree of speed" (p. 91), so that such a body achieves infinitely many speeds in the infinite number of instants contained in a finite amount of time. Leibniz agreed, formulating his famous "law of continuity": "nature never makes leaps."
These ideas were far from being generally accepted. Descartes, for example, did not believe in the "law of continuity", particularly with regard to his theory of collisions. Others seemed to believe in a discrete physical world where lengths and time cannot be infinitely divided. This, of course, leads to all sorts of difficulties. One finds people arguing, for example, that bodies move more or less quickly as a function of the amount of rest that is mixed in with their motion:
This is why, when there are two moving bodies, one of which moves twice as fast as the other, it is necessary to conceive that, of two moments, in both of which the faster one moves, the less rapid one moves only in one and rests in the other... (F. Bernier, Abrégé de la philosophie de Gassendi, quoted on p. 95.)
The whole attempt to understand motion caused great confusion, essentially because it required an understanding of the structure of the real numbers and of the notion of continuity. It is hard to know what to think, for example, when one finds Leibniz arguing that "we may consider rest as infinitely small motion (that is, as equivalent to a particular instance of its own contradictory)" (p. 92). The idea that "motion" and "rest" were two different kinds of things collided with the infinite divisibility of the line to produce great confusion.
This discussion, and related ones such as one about (what we would nowadays describe as) the "first point in an open interval" (the discussion at the time was about what happens when an object "just begins" to move), can serve to remind us of how truly difficult it is to understand the real line, and how many pitfalls lurk behind the apparently straightforward idea of using functions to model physical motion. Could some of these difficulties be lurking in the minds of our calculus students as we introduce them to the notion of the derivative as an "instantaneous velocity"?
Similar comments could be made about other topics discussed in the book. Fontenelle, for example, finds himself arguing that some finite quantities (he calls them "indeterminable finite quantities") become infinite when squared, for reasons that are directly connected with arguing about the "largest" or "smallest" element of certain collections of finite and/or infinite numbers. There is much here to remind us that our conception of the real numbers and our understanding of continuous functions are the result of an enormous amount of intellectual work, and that understanding these concepts may be something students find quite difficult.
The second useful thing the book offers us is a large selection of texts from the period under study. Blay's book includes substantial extracts from the writings he studies, often translated from other languages, which are discussed in detail. There are many of these that could be used in class (particularly in a calculus class or in a first class on the foundations of the calculus) to stimulate students to think seriously about the mathematics of motion, of continuity, and of the structure of the real line. Similarly, explaining what is wrong with Fontenelle's "indeterminable finite quantities" would be a great exercise for students learning set theory. Thus, the book can be of great value for instructors who wish to use original sources in their teaching.
Working through a historical book like this one can add significant richness both to our understanding of mathematics and to our teaching. While Blay's technical and philosophical argument is quite dense and open to disagreement, this book is worth the effort, both for the perspective it can give us on the real difficulties hiding in the calculus and its application to physical processes and for the teaching opportunities it creates.
Fernando Q. Gouvêa is associate professor of mathematics and chair of the Department of Mathematics and Computer Science at Colby College. He was a participant in the Institute in the History of Mathematics and Its Use in Teaching, an MAA professional development project that has had an enormous impact on his teaching and research interests.