This entry in the Boston Studies in the Philosophy and History of Science series is one of over three hundred volumes exploring broad interdisciplinary concerns. This volume features international contributions from mathematicians and philosophers, theorists and practitioners. Springer asserts that the series “looks into and reflects on interactions between epistemological and historical dimensions in an effort to understand the scientific enterprise from every viewpoint.” Here in *Mathematics as a Tool* is a historical arc investigating the origin, nature, methods, and limits of mathematics when applied in science to understanding the real world, looking back to the *Almagest* and forward to “big data”. The question, as contributor Ann Johnson (1965–2016) of Cornell University, to whom this volume is dedicated, puts it, is to understand “why engineers mathematized at all and why they mathematized in different ways in different places.”

With a breadth not much less than the series and fewer pages than the series has volumes, this entry is necessarily high-level and too diverse to sustain a single focus. From the generalities emerges a discussion of mathematical modeling yesterday and today that is enlightening for future practitioners and grounding for those seeking to understand the “mathematization of science” within the philosophy of science. When does mathematics express the laws of nature and when is it merely systematizing observational knowledge? How is mathematics employed ontologically and contextually? The spectrum from the general to the particular raises different questions of validity and technique. Rather than answering such questions, the collection of essays raises the issues effectively and seeks to frame the questions for formal consideration. As Henrik Kragh Sørensen observes here, “The relation between mathematics and the sciences is a historically complex one that is difficult to analyze philosophically in all its nuances.” The editors have gathered articles assaying several key nuances.

As an aid to formal consideration, the volume’s editors “…propose five characteristic aspects of mathematics as a tool…” The five are:

- mathematics “as a mediator between theory and observation”,
- within data-driven research (particularly “big data”),
- model tuning,
- approaches to equation solving (Navier–Stokes, etc.), and
- non-representational idealizations.

For non-representational idealizations, think of Euler’s 18th-century application of ideal flow theory: “modern scientists and historians ridicule Euler’s idealization as the origin for the schism between theory and practice.” (It may be bad enough that “When Euler applied his equations to design a fountain for Frederick the Great of Prussia, it failed to work” without blaming the long-lived debate on Euler, too.) In “Approaching Reality by Idealization”, contributor Michael Eckert finds idealization enlightening for the modeler, precisely because of its shortcomings in explaining observations accurately: “the ideal and the real are no antagonists. The study of ideal flow is … crucial for understanding [real flow].”

Rather than Johnson’s “Why”, the question answered most often is the more prosaic “How?” Thus, without meaning to be so, the book is also a gallery of mathematical modeling approaches applicable to a wide range of problems. Certainly of value to those entering the craft, these include natural language processing, big data considered formally, and ODE systems for synthetic biology.

Some models may have lost their value with new understanding, yet there is value in understanding the development of the models. This includes the beguiling intricacy of Ptolemy’s epicycles and deferents. Regardless of the validity of historical or proposed models here, there is value to the would-be modeler in understanding “the mathematical modeling process as an iterative and dialectic process.” The work contains a panoply of examples that serve not as an introduction but as an aid in transition from theory to practice. Indeed, practice and application are the common default toward which each collected piece tends. Contributor David Aubin asserts that this is inevitable in that the “conception of the applicability of mathematics will slide the problem away from concepts, laws, and language to *practice*.” Often here the beginning feels like an –ology and the conclusion deserves the prefix “computational.” That is, the mathematics tool helps frame the open problem from the natural world. Mathematization of the model supports a theory allowing prediction and control. These monographs often recall the resulting founding of an interdisciplinary field, as the modeling process frequently requires cross-pollination. “Modelers however are not typically biologists but come from engineering and other quantitative fields.”

Also apparently unintended yet valuable is a common thread of the computerization of mathematization. Feeding into that discourse is simulation, essential to modeling. From “Boon and Bane: On the Role of Adjustable Parameters in Simulation Models” we read that “Simulation then does not merely extend mathematical modeling, but adds a new twist to it. Now classical and simulation experiments cooperate, building on the feedback loop…” Contributor Miles MacLeod notes that managing complexity in models is the promise of computerization: “there is a growing expectation that the power of modern computation will supplant the limitations of traditional modeling methods when dealing with highly complex phenomena like biological and climate systems.” The same author suggests the promise is more sizzle than steak: “In systems biology the rhetoric of what computational and mathematical methods can achieve has arguably got ahead of what is actually being accomplished.” MacLeod continues to hem in the computer’s role with, “in order for philosophy of science to develop more realistic images of what computational and mathematical methods can […] we should concentrate on their roles as *tools of investigation*.”

As a tool, computers are seen here not as a convenience but as a necessity for working many modern models. Per contributors Hasse and Lenhard, “the (theoretical) models often cannot be studied directly. They first have to be implemented on computers.” In the feedback loop, computers increase complexity at the same time as they enable managing it. As the same pair of contributors observes: “This sprawling of variants cannot be solely attributed to the use of computers but it is certain that computers strongly accelerated that development. They have also favored the increase of the number of parameters in a model….” Contributor Nicolas Fillion further explores this explicitly in “The Vindication of Computer Simulations”, including justification of computer simulations within the philosophy of science, while MacLeod is a bit more bearish about the “limitations to the degree to which computational and mathematical [methods] really can overcome parametric uncertainty. …there is a question mark over the purpose and value of computational modeling in any field.” As a tool, computers appear here to be akin to a lever, a force multiplier: “computer experiments yield new opportunities compared to classical experimentation…”

There is also something here for developers of curricula. Developing and using reasonably accurate real-world models requires an integrated knowledge of mathematics, particularly a statistical basis for analytical techniques. From the introduction: “Mathematics educators do address these ideas when they worry about ordering the secondary mathematics curriculum. Students may learn calculus but doing so will not necessarily help them understand statistics.” The integration of both is needed to develop or apply meta-study concepts as reviewed in “Systems Biology in the Light of Uncertainty: The Limits of Computation.” Indeed, for the would-be modeler, statistics proves in this compendium to be the glue between theory and the phenomenon modeled. Often, this is averred precisely: “The number of parameters in a physical model should never exceed the number of adjustable parameters in a purely mathematical (statistical) correlation of the studied phenomenon…” Compare this to the clarion call for early exposure to differential equations for future modelers in “An Alternative Calculus Sequencing” (*Focus*, August/September 2017). That article’s point that “many science and engineering phenomena modeled by differential equations do not require in-depth knowledge of the disciplines” is supported by nearly every case study herein.

Expounding on the philosophy of model building with several historical examples, this collection of papers highlights approaches to modeling and the necessities underpinning them. Among the necessities highlighted are an overview of the theory of error analysis, a few diagrams of the necessary high-level modeling steps, and the transformation of the model into mathematical terms (“idealizations play a central role in this transformation”). Modeling is an obvious application of the mathematics tool. This common instance of mathematics employed in science falls under the more general umbrella summarized by contributor Jürgen Jost: mathematics “consists in providing a framework for conceptual thinking and formal reasoning. This is valid across all disciplines, and actually badly needed in many of them.” Jost’s concluding contribution nicely points to what the considerations contained herein support: “there exist profound analogies between models in physically very different domains. It is actually one of the main driving forces of mathematics to consider the corresponding structures and relations abstractly and independently of any particular instantiation and to work out general theories to new domains… For instance, similar statistical phenomena show up in quantum mechanics, in the analysis of biological high-throughput data, or in the description of social phenomena.” Mathematics, one tool for all science.

Tom Schulte teaches mathematics at Oakland Community College and has a home workshop with tools going back to his former career as a metal model maker at General Motors truck engineering.