
From Discrete to Continuous: The Broadening of Number Concepts in Early Modern England

Katherine Neal
Publisher: Springer Verlag
Publication Date: 2002
Number of Pages: 184
Format: Hardcover
Series: Studies in History and Philosophy of Science 16
Price: 99.95
ISBN: 978-1402005657
Category: General
[Reviewed by Eisso Atzema, on 01/13/2004]

As we all know, the 17th century was a time of major upheaval in intellectual culture. One of these revolutions was deemed by Alexandre Koyré to be characteristic enough of the era to be reflected in the title of his well-known study of the history of 17th-century science, Du monde clos à l'univers infini (From the Closed World to the Infinite Universe). Between the middle of the 16th century and the late 17th century, the classical view of the world as a bounded space on a human scale, not unlike the Shire in Tolkien's Lord of the Rings, was replaced by the concept of an infinitely extending universe in which even within our solar system all relevant distances go far beyond any human scale. But this one revolution in cosmology is mirrored by many others in almost every other field. In biology, to give an example, the discoveries of Van Leeuwenhoek and others made it clear that the traditional world held much more than the mere eye could behold. In history, the beginning of civilization was pushed back in time considerably. In mathematics, finally, the Scientific Revolution brought a considerable extension of the number concept, from essentially the natural numbers to what we now call the number line. Before, although one might not have been able to count all numbers, one could at least think of them as a giant flock of individual numbers (or parts of numbers). In the course of the 17th century, number began to be equated with magnitude. Suddenly there were a whole lot more numbers, and the flock analogy faltered (what is the square root of a bunch of individual numbers?). Although in this case mostly on a philosophical level, this transformation of the number concept was one of the many changes in mathematics that paved the way for the creation of the calculus.

As it is, this particular transformation has so far not been studied in much detail. Of course, there is Klein's Greek Mathematical Thought and the Origin of Algebra, but in many ways this is more a philosophical than a historical study. Other than Klein, basically only Michael Mahoney, Helen Pycior and, very recently, Jackie Stedall have more than touched upon it. In the book under review here, Katherine Neal discusses this transformation of the number concept as it found expression in the works of a number of mathematicians working in early modern England (including Scotland). As far as the mathematicians selected are concerned, there are few surprises. After a general introduction, Neal examines the works of Robert Recorde and a few of the other arithmeticians of the time, Thomas Harriot, William Oughtred, John Napier & Henry Briggs, and finally Isaac Barrow and John Wallis. Largely following the lines set out by Klein, Neal in each case investigates the author's views on the nature of number. For the most part, she ably recounts these views, thus providing a coherent account of a topic that could hitherto only be found scattered in the literature. Yet, after reading the whole book, I felt rather dissatisfied.

Other than in cases of blatant incompetence or willful misrepresentation, it is always hard to pinpoint what exactly makes the reading of a book less rewarding. In this case I can think of a number of reasons, some of which have nothing to do with the book and everything to do with my own expectations. One important reason for my disappointment was the author's choice to work closely along Klein's lines. In fact, although Neal never explicitly says so, her study is really about what the authors under study have to say about the nature of number, i.e. their philosophy of the nature of number. Although this obviously is a legitimate topic of study, I would have liked to see more on how these views may have impinged on their actual work, if at all. This is an equally legitimate question and a particularly relevant one for this stage of the development of algebra. Indeed, I would dare claim that along with this broadening of the number concept as such came a relative indifference among mathematicians about what should or should not be counted as a number and how two numbers might differ. Just as for Newton his law of gravitation was a perfectly acceptable result in spite of the philosophical difficulties surrounding the concept of action at a distance, so for Wallis and most British mathematicians after him the nature of number was not a mathematical question, but rather a philosophical one. All that counts for doing mathematics is that all numbers can be manipulated in the same way integers can.1 Since for somebody like Vieta the nature of number clearly was a question that he perceived to be relevant for doing mathematics, one may ask what exactly changed in the sixty-odd years that separate them. Along the way, one would also have a nice opportunity to deal with the disdain heaped by Wallis and his like upon philosophers such as Hobbes and Berkeley, whose main mistake seems to have been that they considered thinking about the foundations of mathematics to be part of doing mathematics. Neal omits Hobbes from her book altogether, and in the case of Wallis she basically sticks to the rather conflicting statements about the nature of number that he felt compelled to put into writing.

Because of her focus on the authors' philosophies of number, she also largely ignores another big issue of this era in the history of algebra. As Mahoney has brought up in his study of Barrow's mathematics, for authors such as Barrow and Wallis a number could very well be defined as the solution to an equation, and one could use various techniques to approximate or exactly find the value of this number. Again, this would not necessarily have been Vieta's view, and once more one might ask what changed. Related to this, I would be interested in knowing when logarithms began to be viewed as the solutions of an equation, i.e. as the inverses of exponentials. In fact, I would be interested in knowing to what extent logarithms were looked upon as numbers in their own right at all. For most of the 17th century they seem to have been a magical tool more than anything else. Neal does not provide answers to these questions, but then it would be unfair to expect that she would. She does explain how logarithm tables were computed, but unfortunately her explanation is completely garbled.
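
To give an idea of what a correct explanation would at least have to convey: Briggs' tables rest on the observation that taking a square root halves a logarithm, so that any number can be brought so close to 1 that its logarithm can essentially be read off. Here is a minimal sketch of that principle in modern terms (in Python; the function name and the floating-point shortcut are mine, and this is of course an anachronistic reconstruction, not Neal's presentation or Briggs' actual hand computation):

    import math

    def briggs_log10(x, n=20):
        # Take n successive square roots; each one halves log10(x).
        y = x
        for _ in range(n):
            y = math.sqrt(y)
        # For y this close to 1, log10(y) is approximately (y - 1)/ln(10),
        # so multiplying back by 2**n recovers log10(x).
        return (y - 1) / math.log(10) * 2**n

    print(briggs_log10(2.0))   # ~0.3010300, agreeing with math.log10(2) to about six digits

Briggs himself, of course, had no floating point to lean on: he extracted dozens of successive square roots of 10 by hand, to some thirty decimal places, but the halving principle is the same.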

Finally, there are two other topics that, although mentioned in passing by Neal, should have received much more attention. The first is to do with translations and reconstructions of the classics, particularly criticism of Euclid's Book II. Neal does discuss Barrow's rendering of Book II, but might it not be that a detailed study of more of these translations and reconstructions could have shed some more light on how the various authors viewed numbers, and on how these views fitted into the mathematical version of the Battle of the Books (a.k.a. the war between the ancients and the moderns)? Speaking of the latter, to what extent can this transformation of the number concept actually be interpreted as part of the war between ancients and moderns? Clearly Wallis is a modern, while Barrow might be an ancient. Neal touches upon this question a number of times, but only in passing; she does not address it explicitly (or explicitly enough).
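
For readers unfamiliar with what is at stake in Book II, one example (mine, not Neal's) may help: Proposition II.4 asserts, in purely geometric language about squares and rectangles, what a modern would simply write as

    (a + b)² = a² + 2ab + b²

and whether such propositions should be read as algebra in geometric disguise or as irreducibly geometric statements is precisely the sort of question on which an "ancient" and a "modern" would part ways.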

The second topic is to do with the study of series by Wallis and others. It seems almost too much of a coincidence that at the same time that the irrational numbers become fully accepted as numbers, the study of "nice" series to represent the most important of these numbers really takes off. Could it be that Wallis' efforts to find a series for π at least partly sprang from an unease with the irregularity of the decimal representation of that number? Also, to what extent was the decimal representation of a number actually associated with a series? Was Leibniz the only one to use a different base (scale)?
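
Wallis' own result, the infinite product π/2 = (2·2·4·4·6·6·…)/(1·3·3·5·5·7·…) from the Arithmetica Infinitorum, makes the contrast vivid: the product is perfectly regular where the decimal expansion of π is not. A few lines of Python (my illustration, obviously not Wallis' procedure) show how slowly that regularity pays off numerically:

    from math import pi

    def wallis_product(n):
        # Partial product of Wallis' formula pi/2 = prod_{k>=1} (2k)^2 / ((2k-1)(2k+1)).
        prod = 1.0
        for k in range(1, n + 1):
            prod *= (2 * k) ** 2 / ((2 * k - 1) * (2 * k + 1))
        return 2 * prod

    for n in (10, 100, 1000):
        print(n, wallis_product(n), pi - wallis_product(n))

The error shrinks only like 1/n, so as a way of actually computing digits the product is hopeless; its appeal, one suspects, lay precisely in its formal regularity rather than in its numerical use.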

Anyway, most of what I can find fault with in the case of this book has to do with what is not in it. As for what is included, I only have minor comments, mostly to do with the mathematics that she explains. I already mentioned the garbled explanation of the procedure to compute logarithms on p. 108. At quite a few other places, her explanations of the mathematics are rather unclear, with key elements sometimes missing (see p. 123 for instance, where Barrow's original on p. 122 is much clearer about what the assumptions are). All in all, the book probably does not stray too far from the dissertation that it is based on. As such, I find the book acceptable. At the same time, as so often with theses, I find it vaguely disappointing, and I wish the author had given herself a few more years to let her work mature into a study more deserving of publication as a full-fledged book.

While I do not see any reason to blast the author for this book, I would like to take the opportunity to severely criticize her publisher. I think that Kluwer has sunk to new depths with the utterly unloving presentation of this book. I am not sure who was responsible for the make-up of the manuscript, but clearly the form in which it ended up being published does not befit any serious publisher. Obviously, the manuscript was not revised at all. Especially at the beginning of the book, there are many typos, varying from misspelled names and words to mangled sentences. Page 44 even has all kinds: one mangled sentence, "scaler" for "scalar", "connivent" for "convenient" and "Descatres" for "Descartes". As for the (numerous) illustrations, they are mostly low-resolution scans, often carelessly trimmed right through letters. In a number of cases, a mysterious number in a modern typeface shows up (see p. 126 for an example of both). No attempt has been made to achieve anything like uniformity among these scans. Finally, the mathematics seems to be set in various typefaces (I counted at least three different ones). Not even Word's Equation Editor renders exponents as horribly as we find them on p. 131 (which same page, I just noticed, has "Issac Barrow" for its header). Evidently, Kluwer could not be bothered to use simple standard-issue LaTeX. For the $76.00 that Kluwer charges for this book, this total lack of care is an utter disgrace as well as an affront to the author.


Notes:

1. Cf. p. 152, figure 5, which reproduces what Wallis had to say about the question whether one and zero are numbers: "Hac autem de re, erat mihi quidem in animo, litem neutiquam movere; cum illa controversia Logica potius seu Metaphysica, quam Mathematica videri possit; saltem obiter monuisse, non modo Unum sed & Nullum, apud arithmeticos eodem omnino modo ac reliquos numeros tractari; adeoque apud illos (quicquid de Metaphysicis dicendum sit) vel pro numeris, vel quasi numeris reputari." (On this matter, however, it was by no means my intention to stir up a dispute, since that controversy may be seen as a logical or metaphysical rather than a mathematical one; I only wished to remark in passing that not only One but also Nought is treated by arithmeticians in entirely the same way as the other numbers, and hence is reckoned by them, whatever may be said of the metaphysics, either as a number or as good as a number.)


Eisso Atzema (atzema@math.umaine.edu) is Lecturer in Mathematics at the University of Maine at Orono.

Acknowledgments.
1. Transformation of the Number Concept.
2. The Ancient Sources.
3. The Contemporary Influences.
4. Early Modern English Algebra.
5. The Development of the Logarithms: Napier and Briggs.
6. Isaac Barrow.
7. John Wallis.
8. Conclusion.
References. Indices.