The original French passage of *Résumé du Cours d’Analyse Infinitésimale de l’Université de Gand* (*Survey of Infinitesimal Analysis, Ghent University*) can be found at [PM], while our English translation can be found in Appendix 2. Mansion began his passage by stating:

III. *A Wronskian* \( W(r,s,t,u) \) *is identically zero if one of the functions* \( r,s,t,u \) *is identically zero, or there exists between them a homogeneous linear relationship; and* VICE VERSA.

This statement is only partially valid, since Peano’s counterexample (Passage 2) disproves it as well. We have already noted that the Wronskian of a set of functions will be zero if the functions are linearly dependent, and Mansion gave the same argument we provided earlier.
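Peano’s counterexample, commonly given as the pair \( x^2 \) and \( x|x| \) on the whole real line, can be checked numerically. A minimal Python sketch (the function names are ours):

```python
# Numeric check of Peano's counterexample: the 2x2 Wronskian of
# r(x) = x^2 and s(x) = x|x| vanishes everywhere, yet the functions
# are linearly independent on the real line.

def r(x):  return x * x
def rp(x): return 2 * x          # r'
def s(x):  return x * abs(x)
def sp(x): return 2 * abs(x)     # s' (s is differentiable, even at 0)

def wronskian(x):
    # | r   s  |
    # | r'  s' |
    return r(x) * sp(x) - rp(x) * s(x)

for x in (-3.0, -0.5, 0.0, 0.5, 2.0):
    assert abs(wronskian(x)) < 1e-12

# Independence: a*r + b*s = 0 at x = 1 gives a + b = 0, while at
# x = -1 it gives a - b = 0, forcing a = b = 0.
```

The point of the example is that the Wronskian vanishes identically even though no linear relation holds on all of \( \mathbb{R} \), since any relation valid for \( x > 0 \) fails for \( x < 0 \).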

If \( r, \) for example, is identically zero, the same is true of the derivatives \( {r^{\prime}}, {r^{\prime\prime}}, {r^{\prime\prime\prime}}, \) and \( W, \) having a column of zeros, is also identically zero. If one has \( u=ar+bs, \) and as a result, \( {{u^{\prime}}=a{r^{\prime}}+b{s^{\prime}}},\) \({{u^{\prime\prime}}=a{r^{\prime\prime}}+b{s^{\prime\prime}}},\) \({{u^{\prime\prime\prime}}=a{r^{\prime\prime\prime}}+b{s^{\prime\prime\prime}}}, \) \( W\) will have a column of zeros when one subtracts from the fourth column the first multiplied by \( a \) and the second multiplied by \( b\).
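Mansion’s column argument can be illustrated numerically. A small sketch, with arbitrarily chosen sample functions \( r = x^3, \) \( s = x^2, \) \( t = x^5 \) and the dependent \( u = 2r - 3s \):

```python
# With u = a*r + b*s, the fourth column of W equals a*(first column)
# plus b*(second column), so det W = 0 at every point.

def det(m):
    # Laplace expansion along the first row.
    if len(m) == 1:
        return m[0][0]
    total = 0.0
    for j in range(len(m)):
        minor = [row[:j] + row[j+1:] for row in m[1:]]
        total += (-1) ** j * m[0][j] * det(minor)
    return total

a, b = 2.0, -3.0

def cols(x):
    # each column lists [f, f', f'', f''']
    r = [x**3, 3*x**2, 6*x, 6.0]
    s = [x**2, 2*x, 2.0, 0.0]
    t = [x**5, 5*x**4, 20*x**3, 60*x**2]
    u = [a*ri + b*si for ri, si in zip(r, s)]
    return r, s, t, u

for x in (0.5, 1.0, 2.0):
    r, s, t, u = cols(x)
    W = [[r[i], s[i], t[i], u[i]] for i in range(4)]
    assert abs(det(W)) < 1e-9
```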

The problem, then, must be in the proof of the “vice versa.” Mansion proceeded by induction, and while the induction step is (essentially) valid, there is a problem with the base case.

SIMILARLY, if \( W\) is identically zero, one of the functions \( r,s,t,u \) is equal to zero, or there exists between \( r,s,t,u \) a linear relationship. We suppose first this reciprocal is established for Wronskians of three lines and we prove that it is true for Wronskians of four lines. Thus, define \( k,m,n,p, \) the minors of \( W\) with regard to \( {r^{\prime\prime\prime}},{s^{\prime\prime\prime}},{t^{\prime\prime\prime}},{u^{\prime\prime\prime}}. \) These minors are themselves Wronskians of three lines. If one of them is identically zero, there exists a linear relationship between the functions involved (according to the hypothesis made for Wronskians of three lines), and the theorem is demonstrated.

The first mistake in this passage is simply assuming that the theorem is true for a \( 3\times 3\) Wronskian. While we know the theorem is false for two functions, technically Peano did not give a counterexample for three. However, the theorem is indeed false for three functions as well, as Bôcher showed in [B3]. Despite the problem with the base case, we will reproduce the rest of Mansion’s argument, filling in details for clarity.

If none of the minors \( k,m,n,p \) is equal to zero, we consider the identical relations \[ kr+ms+nt+pu=0,\quad \left( {{\mathcal{Z}}_1}\right) \] \[ k{r^{\prime}}+m{s^{\prime}}+n{t^{\prime}}+p{u^{\prime}}=0, \quad \left({{\mathcal{Z}}_2}\right) \] \[ k{r^{\prime\prime}}+m{s^{\prime\prime}}+n{t^{\prime\prime}}+p{u^{\prime\prime}}=0, \quad\left({{\mathcal{Z}}_3}\right) \] \[ k{r^{\prime\prime\prime}}+m{s^{\prime\prime\prime}}+n{t^{\prime\prime\prime}}+p{u^{\prime\prime\prime}}=0, \quad\left({{\mathcal{Z}}_4}\right) \] obtained in expressing the properties of the minors of the zero determinant \( W\). Because \( {D_z}W=0,\) one still has, according to formula (2), \[ k{r^{iv}}+m{s^{iv}}+n{t^{iv}}+p{u^{iv}}=0. \quad \left({{\mathcal{Z}}_5}\right) \]

These relationships can be derived by taking the determinants of the matrices: \[ \left[ {\begin{array}{*{20}c} r & s & t & u \\ r & s & t & u \\ {r'} & {s'} & {t'} & {u'} \\ {r''} & {s''} & {t''} & {u''} \\ \end{array}} \right]\quad \left[ {\begin{array}{*{20}c} {r'} & {s'} & {t'} & {u'} \\ r & s & t & u \\ {r'} & {s'} & {t'} & {u'} \\ {r''} & {s''} & {t''} & {u''} \\ \end{array}} \right]\quad \left[ {\begin{array}{*{20}c} {r''} & {s''} & {t''} & {u''} \\ r & s & t & u \\ {r'} & {s'} & {t'} & {u'} \\ {r''} & {s''} & {t''} & {u''} \\ \end{array}} \right]\quad \left[ {\begin{array}{*{20}c} {r'''} & {s'''} & {t'''} & {u'''} \\ r & s & t & u \\ {r'} & {s'} & {t'} & {u'} \\ {r''} & {s''} & {t''} & {u''} \\ \end{array}} \right] \]

Each of these matrices has a zero determinant, and expanding along the top rows gives the equations \( \left( {{\mathcal{Z}}_1}\right), \left( {{\mathcal{Z}}_2}\right), \left( {{\mathcal{Z}}_3}\right), \) and \( \left( {{\mathcal{Z}}_4}\right), \) except that the expansions carry alternating signs which the printed equations lack. It appears this was an error in the original printing, because the argument works if the signs are treated carefully. The next step is to differentiate each of the equations \( \left( {{\mathcal{Z}}_1}\right), \left( {{\mathcal{Z}}_2}\right), \left( {{\mathcal{Z}}_3}\right), \) and \( \left( {{\mathcal{Z}}_4}\right), \) making use of the product rule and simplifying at each step. Again, the equations should have alternating signs.
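The sign-corrected forms of \( \left( {{\mathcal{Z}}_1}\right) \) through \( \left( {{\mathcal{Z}}_3}\right) \) hold for any functions whatever, since each comes from a determinant with a repeated row. A small numeric sketch (sample functions \( x, x^2, x^3, x^4 \) chosen arbitrarily):

```python
# k, m, n, p are the 3x3 minors of W with respect to the last row;
# substituting any of the first three rows into the "foreign"
# expansion gives zero, with alternating signs: k*f - m*g + n*h - p*i.

def det3(a):
    return (a[0][0]*(a[1][1]*a[2][2] - a[1][2]*a[2][1])
          - a[0][1]*(a[1][0]*a[2][2] - a[1][2]*a[2][0])
          + a[0][2]*(a[1][0]*a[2][1] - a[1][1]*a[2][0]))

x = 1.7
rows = [
    [x,   x**2, x**3,   x**4   ],  # r, s, t, u
    [1.0, 2*x,  3*x**2, 4*x**3 ],  # first derivatives
    [0.0, 2.0,  6*x,    12*x**2],  # second derivatives
]

def minor(j):
    # delete column j from the three rows above
    return det3([row[:j] + row[j+1:] for row in rows])

k, m, n, p = (minor(j) for j in range(4))

# (Z1)-(Z3) with corrected signs; (Z4) additionally needs W = 0,
# which these sample functions do not satisfy.
for fr, fs, ft, fu in rows:
    assert abs(k*fr - m*fs + n*ft - p*fu) < 1e-9
```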

Differentiating successively \( \left( {{\mathcal{Z}}_1}\right), \left( {{\mathcal{Z}}_2}\right), \left( {{\mathcal{Z}}_3}\right), \left( {{\mathcal{Z}}_4}\right), \) we simplify the derivative of each of these equations, by means of the following, \[ {k^{\prime}}r+ {m^{\prime}}s+ {n^{\prime}}t+ {p^{\prime}}u=0, \] \[ {k^{\prime}}{r^{\prime}}+{m^{\prime}}{s^{\prime}}+{n^{\prime}}{t^{\prime}}+{p^{\prime}}{u^{\prime}}=0, \] \[ {k^{\prime}}{r^{\prime\prime}}+{m^{\prime}}{s^{\prime\prime}}+{n^{\prime}}{t^{\prime\prime}}+{p^{\prime}}{u^{\prime\prime}}=0, \] \[ {k^{\prime}}{r^{\prime\prime\prime}}+{m^{\prime}}{s^{\prime\prime\prime}}+{n^{\prime}}{t^{\prime\prime\prime}}+{p^{\prime}}{u^{\prime\prime\prime}}=0. \]

Notice that this is simply a system of linear equations in the variables \( {k^{\prime}},{m^{\prime}},{n^{\prime}}, \) and \( {p^{\prime}}. \) Mansion solved for these variables. One way of doing so would be to rewrite the corrected system as

\[ {k^{\prime}}r- {m^{\prime}}s+ {n^{\prime}}t= {p^{\prime}}u, \] \[ {k^{\prime}}{r^{\prime}}-{m^{\prime}}{s^{\prime}}+{n^{\prime}}{t^{\prime}}={p^{\prime}}{u^{\prime}}, \] \[ {k^{\prime}}{r^{\prime\prime}}-{m^{\prime}}{s^{\prime\prime}}+{n^{\prime}}{t^{\prime\prime}}={p^{\prime}}{u^{\prime\prime}}, \] \[ {k^{\prime}}{r^{\prime\prime\prime}}-{m^{\prime}}{s^{\prime\prime\prime}}+{n^{\prime}}{t^{\prime\prime\prime}}={p^{\prime}}{u^{\prime\prime\prime}}. \] and then apply Cramer’s rule to see that: \[ k' = \frac{{\left| {\begin{array}{*{20}c} {p'u} & { - s} & t \\ {p'u'} & { - s'} & {t'} \\ {p'u''} & { - s''} & {t''} \\ \end{array}} \right|}}{{\left| {\begin{array}{*{20}c} r & { - s} & t \\ {r'} & { - s'} & {t'} \\ {r''} & { - s''} & {t''} \\ \end{array}} \right|}} = \frac{{p'\left| {\begin{array}{*{20}c} s & t & u \\ {s'} & {t'} & {u'} \\ {s''} & {t''} & {u''} \\ \end{array}} \right|}}{{\left| {\begin{array}{*{20}c} r & s & t \\ {r'} & {s'} & {t'} \\ {r''} & {s''} & {t''} \\ \end{array}} \right|}} = p'\frac{k}{p} \]
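The conclusion \( k' = p'\,k/p \) can be sanity-checked numerically on a genuinely dependent family, where \( W \) vanishes identically while the minors do not. Here \( u = 2r - s + 3t \) with \( r, s, t = x, x^2, x^3 \) is an arbitrary choice of ours:

```python
# Central-difference check that k' = p' * k / p for a dependent family.

def det3(a):
    return (a[0][0]*(a[1][1]*a[2][2] - a[1][2]*a[2][1])
          - a[0][1]*(a[1][0]*a[2][2] - a[1][2]*a[2][0])
          + a[0][2]*(a[1][0]*a[2][1] - a[1][1]*a[2][0]))

def minor(x, j):
    # columns r, s, t and the dependent u = 2r - s + 3t,
    # each listed as [f, f', f'']
    r = [x, 1.0, 0.0]
    s = [x**2, 2*x, 2.0]
    t = [x**3, 3*x**2, 6*x]
    u = [2*ri - si + 3*ti for ri, si, ti in zip(r, s, t)]
    cols = (r, s, t, u)
    kept = [cols[i] for i in range(4) if i != j]
    return det3([[kept[0][i], kept[1][i], kept[2][i]] for i in range(3)])

def k(x): return minor(x, 0)   # minor with respect to r'''
def p(x): return minor(x, 3)   # minor with respect to u'''

x, h = 1.3, 1e-5
kp = (k(x + h) - k(x - h)) / (2 * h)   # approximate k'
pp = (p(x + h) - p(x - h)) / (2 * h)   # approximate p'
assert abs(kp - pp * k(x) / p(x)) < 1e-6
```

For this family one can compute \( k = 4x^3 \) and \( p = 2x^3 \) by hand, so both sides of \( k' = p'\,k/p \) equal \( 12x^2 \).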

This yields the formulas given by Mansion in the following passage. Each side of these relationships is the derivative of the logarithm of one of the minors, so upon integration the logarithms differ only by constants. This step requires not merely that none of the minors \( k, m, n, p \) is identically zero, but that each is nonzero at every point under consideration, so that the divisions are legitimate. Integrating and keeping track of the constants will give a relationship between \( r,s,t,\) and \( u. \)

These four relationships give immediately, according to the properties of homogeneous linear equations and the definition of \( k,m,n,p ,\)
\[ {\frac{k^\prime}{k} = \frac{m^\prime}{m} =\frac{n^\prime}{n} =\frac{p^\prime}{p} },\quad {\rm{or}}\quad {Dlk=Dlm=Dln=Dlp}. \]

As a result, according to numbers \( 221\) or \( 103\), \( \alpha, \beta, \gamma \) being the constants, \[ lm=lk+l\alpha,\quad ln=lk+l\beta,\quad lp=lk+l\gamma, \]

\[ m={\alpha}k,\quad n={\beta}k,\quad p={\gamma}k. \]

Substituting these values of \( m, n, p\) into the identity \( kr+ms+nt+pu=0,\) it becomes, after division by \( k,\) \( r + {\alpha}s + {\beta}t + {\gamma}u = 0, \) a relationship which had to be demonstrated.
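The whole chain of the argument can be exercised numerically on a genuinely dependent family. With \( u = 2r - s + 3t \) and \( r, s, t = x, x^2, x^3 \) (an arbitrary choice of ours), the ratios \( m/k, n/k, p/k \) come out constant, and dividing the sign-corrected identity \( kr - ms + nt - pu = 0 \) by \( k \) recovers a fixed linear relationship among \( r, s, t, u \):

```python
def det3(a):
    return (a[0][0]*(a[1][1]*a[2][2] - a[1][2]*a[2][1])
          - a[0][1]*(a[1][0]*a[2][2] - a[1][2]*a[2][0])
          + a[0][2]*(a[1][0]*a[2][1] - a[1][1]*a[2][0]))

def minors(x):
    # columns r, s, t, u = 2r - s + 3t, each listed as [f, f', f'']
    r = [x, 1.0, 0.0]
    s = [x**2, 2*x, 2.0]
    t = [x**3, 3*x**2, 6*x]
    u = [2*ri - si + 3*ti for ri, si, ti in zip(r, s, t)]
    cols = (r, s, t, u)
    out = []
    for j in range(4):
        kept = [cols[i] for i in range(4) if i != j]
        out.append(det3([[kept[0][i], kept[1][i], kept[2][i]]
                         for i in range(3)]))
    return out  # k, m, n, p

k1, m1, n1, p1 = minors(1.0)
k2, m2, n2, p2 = minors(2.0)
# the ratios m/k, n/k, p/k are the constants of the argument
assert abs(m1/k1 - m2/k2) < 1e-9
assert abs(n1/k1 - n2/k2) < 1e-9
assert abs(p1/k1 - p2/k2) < 1e-9

# sign-corrected relation r - (m/k)s + (n/k)t - (p/k)u = 0 at any x
x = 1.5
k, m, n, p = minors(x)
r, s, t, u = x, x**2, x**3, 2*x - x**2 + 3*x**3
assert abs(r - (m/k)*s + (n/k)*t - (p/k)*u) < 1e-9
```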

IV. REMARK. One can evidently establish analogous theorems to the previous ones on the Wronskians where the derivatives are replaced by partial derivatives or total differentials of the functions in question.