Calculus on the European continent began with Gottfried Wilhelm von Leibniz (1646-1716), who conceived of it as the manipulation of infinitely small increments, called differentials, of the variables in an equation. An infinitely small increment in a variable \( x \) is denoted \( dx \), and the rules of Leibniz' calculus allowed him to conclude, for example, that if \( y=x^2 \), then \( dy = 2x\,dx \). Modern-day readers will notice that they can extract the derivative of \( y \) from this equation by formally dividing both sides by \( dx \) to get \( \frac{dy}{dx} = 2x \), but mathematicians did not speak of derivatives during the decades following the birth of the calculus. To them, the differentials \( dx \) and \( dy \) were objects in their own right, variable quantities of infinitely small size, related to one another so that at any point \( (x,y) \) on the parabola, an infinitely small increment \( dx \) in \( x \) results in a corresponding increment of \( 2x\,dx \) in the variable \( y \).
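The rule \( dy = 2x\,dx \) can be recovered by a short computation in Leibniz' spirit (a modern reconstruction, not his original presentation): expanding the increment of \( y = x^2 \) gives

\[
dy = (x + dx)^2 - x^2 = 2x\,dx + (dx)^2,
\]

and the term \( (dx)^2 \), being infinitely small even in comparison with \( dx \), is discarded, leaving \( dy = 2x\,dx \).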

**Figure 2.** Jean le Rond d'Alembert. Pastel drawing by de la Tour (public domain).

In the 1750s, the French mathematician Jean le Rond d'Alembert (1717-1783) wrote that the limit concept was the "true metaphysics of the differential calculus" [Calinger 1995, pp. 482-485] in Denis Diderot's (1713-1784) influential *Encyclopédie* [Diderot 1751]. D'Alembert was not the first person to use limits; for example, Colin Maclaurin (1698-1746) discussed them at some length in his 1742 book *A Treatise of Fluxions* [Maclaurin 1742]. However, d'Alembert gave limits a high profile by championing them in the widely read French encyclopedia. Later in the 18th century, the Swiss mathematician Simon Antoine Jean Lhuilier (1750-1840) won the 1786 Berlin Academy prize for an essay [Lhuilier 1785] in which he gave a systematic account of the calculus using derivatives and limits. By this point, about a century after Leibniz' invention of the differential calculus, the subject had assumed a form that would be more or less recognizable to modern readers. However, the late-18th-century limit concept was an informal one, relying on the sort of intuitive arguments still made in most freshman calculus courses. The \( \varepsilon \)-\( \delta \) formulation was still to come, so at this time differentials and limits were simply competing informal notions.
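For the same example \( y = x^2 \), the informal limit argument runs as follows (a modern rendering of the kind of intuitive reasoning described above, which the later \( \varepsilon \)-\( \delta \) definition would make rigorous):

\[
\frac{dy}{dx} = \lim_{h \to 0} \frac{(x+h)^2 - x^2}{h} = \lim_{h \to 0} (2x + h) = 2x.
\]

Here the problematic infinitely small quantity \( dx \) is replaced by an ordinary finite increment \( h \) that is allowed to tend to zero.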