**Hermite's** *Cours d’analyse de l’école polytechnique*

Peano first referenced Charles Hermite’s 1873 textbook, *Cours d’analyse de l’école polytechnique* ([H]). Hermite was in fact “proving” a theorem we saw stated in Peano’s *Sur les Wronskiens*, namely:

A particular case of the general theorem is as follows: If the determinant formed with the derivatives of orders \( 1, 2, 3 \) of the functions \( x, y, z \) of the same variable \( t \) is identically zero, the curve described by the coordinate point \( x, y, z \) is planar.

Here is Hermite’s simpler version when \( z=t \):

We suppose in fact, for greater simplicity, \( \theta(t) = t,\) so that, \( z\) becoming the independent variable, the equations of the curve are \[ x=\phi (z),\quad y=\psi (z).\] We will find then \[ \Delta = {\left| {\begin{array}{*{20}c} {{\phi}'} & {{\phi}''} & {{\phi}'''} \\ {{\psi}'} & {{\psi}''} & {{\psi}'''} \\ 1 & 0 & 0 \\ \end{array}} \right|} = {{\phi}''}{{\psi}'''}-{{\phi}'''}{{\psi}''} \] and the condition \( \Delta =0,\) in dividing by \( {{\psi}''}^2,\) takes this form \[ {\left({\frac{{\phi}''}{{\psi}''}}\right)}' =0,\] from which one concludes \[ {{\phi}''}=a{{\psi}''},\] \( a \) designating an arbitrary constant. It follows very easily that \[ {{\phi}'}=a{{\psi}'} + b,\] then finally \[ \phi = a\psi + bz + c,\] that is to say, \[ x = ay + bz + c,\] so that the proposed curve is in fact completely contained in the same plane.
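Hermite's computation can be checked symbolically. The following sketch (using Python's `sympy` library, our own choice of tool, not anything from the original text) expands the determinant \( \Delta \) and confirms that a planar curve \( x = a y + b z + c \) makes \( \Delta \) vanish identically:

```python
import sympy as sp

z, a, b, c = sp.symbols('z a b c')
phi = sp.Function('phi')(z)
psi = sp.Function('psi')(z)

def d(f, k):
    """k-th derivative with respect to z."""
    return sp.diff(f, z, k)

# Hermite's determinant for the curve x = phi(z), y = psi(z), z = z
Delta = sp.Matrix([
    [d(phi, 1), d(phi, 2), d(phi, 3)],
    [d(psi, 1), d(psi, 2), d(psi, 3)],
    [1,         0,         0        ],
]).det()

# Expanding along the last row gives phi''*psi''' - phi'''*psi''
assert sp.expand(Delta - (d(phi, 2)*d(psi, 3) - d(phi, 3)*d(psi, 2))) == 0

# A planar relation x = a*y + b*z + c forces Delta to vanish identically
planar = Delta.subs(phi, a*psi + b*z + c).doit()
assert sp.expand(planar) == 0
```

Note that the converse direction, the one Hermite actually argues, is where the division by \( {\psi''}^2 \) (and hence the hidden non-vanishing hypothesis) enters.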

We see Hermite divided by \( {{\psi}''}^2 \) in the proof, and so there is an implicit requirement that \( {\psi}'' \) cannot vanish. For two functions, this is actually an important condition, as noted by Bocher in [B3, p. 140]:

**Passage 6.** Image used with permission from the American Mathematical Society.

Notice how this would eliminate Peano’s counterexamples (Passage 2 and Passage 5) if the interval is the real number line, since both *Sur le déterminant Wronskien* and *Sur les Wronskiens* have examples in which each function has a root at the origin. In [B3], Bocher gave an example of three functions that are linearly independent and have an identically zero Wronskian, yet none of the functions vanishes anywhere on the interval in question.

**Jordan's** *Cours d’analyse de l’école polytechnique*

The relevant passage from Camille Jordan’s textbook occurs on page 149 of [J], and a direct translation with no analysis can be found in Appendix 2. He began by stating that the Wronskian not being identically zero is a necessary and sufficient condition for functions \( x_1,\dots, x_n \) to be linearly independent.

122. One can note that the condition \[ \left| {\begin{array}{*{20}c} {x_1 } & \cdots & {x_n } \\ \vdots & \ddots & \vdots \\ {x_1^{n - 1} } & \cdots & {x_n^{n - 1} } \\ \end{array}} \right| \ne 0 \] expresses the necessary and sufficient condition for there to exist between the functions \( x_1,\dots, x_n \) no linear relationship with constant coefficients, of the form \[ {\alpha}_1 x_1+\cdots + {\alpha}_n x_n =0.\]

He started with the standard—and valid—argument that we have given earlier, showing that if the functions are linearly dependent, then the Wronskian will be zero.

In fact, if there existed a relationship of this type, one would obtain, in differentiating it, \[ {\alpha}_1 {x_1^\prime} +\cdots + {\alpha}_n {x_n^\prime} =0,\] \[ \vdots \] \[ {\alpha}_1 {x_1^{n-1}} +\cdots + {\alpha}_n {x_n^{n-1}} =0,\] and, in eliminating the parameters \( {\alpha}_1 ,\dots , {\alpha}_n ,\) it would become \[ \left| {\begin{array}{*{20}c} {x_1 } & \cdots & {x_n } \\ {x_1^\prime} & \cdots & {x_n^\prime} \\ \vdots & \ddots & \vdots \\ {x_1^{n - 1} } & \cdots & {x_n^{n - 1} } \\ \end{array}} \right| = 0. \]
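This valid direction can be illustrated concretely. The following `sympy` sketch (our own hypothetical example, not Jordan's) takes three functions bound by a linear relation with constant coefficients and confirms that their Wronskian vanishes identically:

```python
import sympy as sp

t, a1, a2 = sp.symbols('t a1 a2')

# x3 satisfies a linear relation with constant coefficients: x3 = a1*x1 + a2*x2
x1, x2 = sp.exp(t), sp.sin(t)
x3 = a1*x1 + a2*x2

# Wronskian determinant: row k holds the k-th derivatives of x1, x2, x3
W = sp.Matrix([[sp.diff(f, t, k) for f in (x1, x2, x3)]
               for k in range(3)]).det()

# The dependent column makes the determinant vanish identically
assert sp.expand(W) == 0
```

The cancellation here is purely multilinear: the third column is a combination of the first two, so the determinant splits into two determinants with repeated columns.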

The other implication is the one in which we are interested. Jordan replaced the first column of the Wronskian matrix with an unknown function \( X \) and its derivatives \( X^{\prime}, X^{\prime\prime},\dots , X^{n-1}.\) Expanding the determinant of this new matrix along the first column and setting it equal to zero defines an \( (n-1)^{\rm th} \) order linear differential equation \[ p_{n-1} (x) X^{(n-1)} + p_{n-2} (x) X^{(n-2)} + \cdots + p_{1} (x) X^{\prime} + p_{0} (x) X =0, \] where \( {p_k}(x)\) is the cofactor formed by eliminating the first column and the \( (k+1)^{\rm th} \) row.

All the functions \( x_1, x_2,\dots, x_n \) are solutions to this differential equation. If we substitute \( x_1\) into the differential equation, the determinant becomes the original Wronskian, which vanishes by hypothesis. If we substitute any of \( x_2,\dots, x_n, \) the matrix has two identical columns, and so its determinant is zero.
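Jordan's substitution argument can be traced on a small concrete case. In this `sympy` sketch (a hypothetical dependent triple of our own choosing, with \( x_3 = x_1 + x_2 \) so that the Wronskian vanishes), the determinant with its first column replaced by a candidate function vanishes for each \( x_i \), for exactly the two reasons just described:

```python
import sympy as sp

t = sp.symbols('t')

# A hypothetical dependent triple with identically zero Wronskian: x3 = x1 + x2
x1, x2, x3 = t, t**2, t + t**2

def col(f):
    """Column of f and its first two derivatives."""
    return [f, sp.diff(f, t), sp.diff(f, t, 2)]

def jordan_det(f):
    """Jordan's determinant: first column replaced by f, f', f''."""
    return sp.Matrix([col(f), col(x2), col(x3)]).T.det()

# Substituting x1 reproduces the (vanishing) Wronskian itself;
# substituting x2 or x3 duplicates a column, so the determinant is zero.
for f in (x1, x2, x3):
    assert sp.expand(jordan_det(f)) == 0
```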

Jordan then noticed this gives \( n \) solutions to an \( (n-1)^{\rm th} \) order linear differential equation. By the standard uniqueness theorems, a fundamental set of solutions should have only \( n-1 \) functions, and he proceeded to argue that there must be a linear relationship between \( x_1, x_2,\dots, x_n .\)

Similarly, if this determinant is zero, \( x_1,\dots, x_n \) will be \( n \) particular solutions of the linear equation of order \( n-1 \) \[ \left| {\begin{array}{*{20}c} {X} & {x_2} & \cdots & {x_n } \\ {X^\prime} & {x_2^{\prime}} & \cdots & {x_n^\prime} \\ \vdots & \vdots & \ddots & \vdots \\ {X^{n - 1} } & {x_2^{n-1}} & \cdots & {x_n^{n - 1} } \\ \end{array}} \right| = 0. \] One will have therefore, in designating by \( X_1,\dots, X_{n-1} \) the independent solutions of this equation, and by \( C_1^{\prime}, \dots , C_{n-1}^n \) constants, \[ x_1 ={C_1^\prime}X_1 +\cdots + {C_{n-1}^{\prime}}X_{n-1} ,\] \[ \vdots \] \[ x_n ={C_1^{n}}X_1 +\cdots + {C_{n-1}^{n}}X_{n-1}.\] Eliminating \( X_1,\dots, X_{n-1} \) between these equations, a linear relationship between \( x_1,\dots, x_n \) will be deduced.

In order to apply the uniqueness theorem for linear differential equations, one needs to require that all the \( p_{k} (x) \) are continuous and that \( p_{n-1} (x) \) has no roots on \( I. \) This condition is exactly what fails in the case of Peano’s examples. This is most easily seen in his second example from Passage 5. Both \( u_{1} = | x| \,x \) and \( u_{2} =x^2 \) are solutions to the first order linear differential equation \[ -{x^2}{y^\prime} + 2xy =0 \] and satisfy the initial condition \( y(1)=1 \). But we cannot conclude that \( u_{1} =c u_2 \) on the real number line because \( p_{n-1} (x) \) (which in this case is \( -u_{2} = -x^2 \)) has a root at zero. Notice that in fact \( u_{1} =c u_2 \) on any interval that does not contain the origin.
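Peano's counterexample can be verified branch by branch, since \( u_1 = |x|\,x \) equals \( x^2 \) for \( x \ge 0 \) and \( -x^2 \) for \( x < 0 \). The following `sympy` sketch checks that each branch solves the differential equation, that the Wronskian of each branch with \( u_2 \) vanishes, and yet that no single constant \( c \) works on the whole real line:

```python
import sympy as sp

x = sp.symbols('x')
u2 = x**2

def ode(y):
    """Left side of the first-order equation  -x**2 * y' + 2*x*y = 0."""
    return -x**2*sp.diff(y, x) + 2*x*y

# u1 = |x|*x equals x**2 for x >= 0 and -x**2 for x < 0; check each branch
for branch in (x**2, -x**2):
    assert sp.simplify(ode(branch)) == 0               # solves the ODE
    W = branch*sp.diff(u2, x) - sp.diff(branch, x)*u2  # Wronskian with u2
    assert sp.simplify(W) == 0                         # vanishes identically

# Yet u1 is not a constant multiple of u2 on all of R:
# u1/u2 equals +1 for x > 0 but -1 for x < 0
assert (x**2 / u2).subs(x, 1) != (-x**2 / u2).subs(x, -1)
```

The leading coefficient \( -x^2 \) vanishing at the origin is precisely what lets the constant \( c \) differ on the two half-lines.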

Jordan’s argument becomes correct if we require \( p_{n-1} (x) \ne 0 \) on \( I \) and all the \( p_{k} (x) \) to be continuous. In the case of two functions, these additional requirements are equivalent to the ones outlined by Bocher in Passage 6 above. The case of more functions is similar to *Peano’s Second Theorem* (Passage 3) because \( p_{n-1} (x) \) is the Wronskian of \( u_{2} (x) ,\dots , u_n (x). \)

**Laurent's** *Traité d’Analyse*

Peano referenced the textbook *Traité d’analyse* written by Hermann Laurent in 1885 ([L]). The section with which Peano was concerned turns out to be a brief note of clarification added to the end of the chapter on determinants and implicit functions. Laurent wrote:

4. If one has \[ \left| {\begin{array}{*{20}c} {x_1} & {x_2} &\cdots & {x_n }\\ {dx_1} & {dx_2} &\cdots& {dx_n}\\ \vdots& \vdots &\ddots &\vdots\\ {d^{n - 1}x_1 } & {d^{n-1}x_2} &\cdots& {d^{n - 1} x_n}\\ \end{array}} \right| = 0, \] one has necessarily, between \( x_1, x_2,\dots, x_n ,\) a relation of the form \[ a_1 x_1 + a_2 x_2 +\cdots + a_n x_n =0, \] \( a_1, a_2,\dots, a_n \) designating constants.

There is little to say about this statement except that it asserts exactly the incorrect claim, that a zero Wronskian implies dependence between functions, which Peano disproved in his two notes.