Someone unfamiliar with the history of the field might reasonably assume that the algorithms used in scientific computing were developed after the computers on which they run. That might be true if you were to somehow count the total number of algorithms in use, but it is certainly not true if you count frequency of use. Most of the commonly used algorithms, and most of the algorithms you’re likely to encounter in an introductory class on numerical methods, predate computers.

Another surprise is that Monte Carlo simulation was *not* developed before the advent of computers. Monte Carlo simulation is a simple idea, much simpler than, say, the Ritz-Galerkin method for solving PDEs, just to pick an algorithm that predated electronic computers. Perhaps computers indirectly inspired Monte Carlo methods because computers encouraged working on much larger problems. Who would want an answer that is *probably* good if the alternative were an answer you *know* is good? But computers encouraged researchers to attempt problems too difficult to solve deterministically, and an answer that is probably approximately correct is better than no answer.
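To illustrate just how simple the idea is, here is a minimal Monte Carlo sketch in Python (my own toy example, not taken from the book): it estimates π by sampling random points in the unit square and counting how many land inside the quarter circle.

```python
import random

def estimate_pi(n_samples: int, seed: int = 0) -> float:
    """Estimate pi by uniform sampling in the unit square.

    The fraction of points with x^2 + y^2 <= 1 approximates the area
    of the quarter circle, pi/4, so multiplying by 4 estimates pi.
    """
    rng = random.Random(seed)  # seeded for reproducibility
    inside = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / n_samples

print(estimate_pi(100_000))
```

The answer is only *probably* close: the error shrinks like 1/√n, so each extra digit of accuracy costs roughly a hundred times more samples. That slow convergence is exactly why the method only became attractive once machines could generate millions of samples cheaply.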

One curious bit of history is that the Fast Fourier Transform (FFT) belongs to both the pre-computer and post-computer eras. We now know from his unpublished papers that Gauss developed a version of the FFT in 1805, but we usually date the FFT to a paper by Cooley and Tukey in 1965. (Interestingly, Gauss’s unpublished work on the FFT even predates Fourier’s publication on Fourier series in 1822.)

Bertil Gustafsson’s book *Scientific Computing: A Historical Perspective* surveys numerical methods from some of the earliest, such as solving the quadratic equation, to recent methods, such as wavelet analysis and multigrid methods. Roughly one third of the book is devoted to pre-computer algorithms, one sixth to algorithms from the beginning of the computer era, and one half to more recent methods.

Numerical methods for linear algebra and differential equations are a large part of the book, which is appropriate since they are a large part of scientific computing. Other than Monte Carlo methods, Gustafsson does not devote much space to statistical methods.

*Scientific Computing* would make a good complement to a standard numerical methods textbook. It doesn’t go into great detail — how could it, covering centuries of development in 262 pages? — but it provides an insight into the motivation and development of methods barely mentioned in a numerical methods textbook.

John D. Cook is an independent consultant working in information privacy.