Researches in the Integral Calculus. Part I

Author(s) H. F. Talbot
Year 1836
Volume 126
Pages 40
Language en
Journal Philosophical Transactions of the Royal Society of London

Full Text (OCR)

§ 1. Brief Historical Sketch of the Subject. The first inventors of the integral calculus observed that only a certain number of formulæ were susceptible of exact integration, or could be reduced to a finite number of terms involving algebraic, circular, or logarithmic quantities. When this result could not be attained, they were accustomed to develop the integral in an infinite series. But this method, although useful when numerical values are to be computed, is entirely inadequate, in an analytical point of view, to supply the place of the exact integral; for the progress of analysis has shown many instances of exact relation between different integrals which cannot by any means be inferred from the infinite series in which they are developed. The first great improvement beyond this was made by Fagnani about the year 1714. This most acute and ingenious mathematician proposed the following question to the scientific world in an Italian journal*: "Given a biquadratic parabola whose equation is $x^4 = y$, and an arc of it, to find another arc, so that their difference may be rectifiable." No answer appearing, he published a solution of the problem in the year 1715†, and extended it in a nearly similar manner to other curves whose equation is $x^n = y$, viz. to those cases where $n$ equals one of the numbers $3, \frac{5}{2}, \frac{7}{3}, \frac{9}{4}, \frac{11}{5}, \frac{13}{6}$. In the year 1718 and afterwards he published a variety of important theorems respecting the division into equal parts of the arcs of the lemniscate, and respecting the ellipse and hyperbola, in both of which he showed how two arcs may be determined of which the difference is a known straight line. These discoveries justify us in regarding Fagnani as the founder of a new and very curious branch of analysis. 
Euler, who enriched almost every department of science with new discoveries, exhibited the complete algebraic integral of the equation $$\frac{dx}{\sqrt{\alpha + \beta x + \gamma x^2 + \delta x^3 + \varepsilon x^4}} + \frac{dy}{\sqrt{\alpha + \beta y + \gamma y^2 + \delta y^3 + \varepsilon y^4}} = 0;$$ a remarkable theorem, which long continued to be the ne plus ultra of this branch of science, little success having attended the endeavours of mathematicians to arrive at results of greater generality. * Giornale de' Letterati d'Italia, tom. xix. p. 438. † tom. xxii. p. 229. The excellent work of Legendre* was destined to arrange, classify, and distinguish the properties of elliptic integrals which are implicitly contained in Euler's theorem above mentioned. In this treatise he has thoroughly examined the nature of these transcendentals, and presented the results of his inquiries in a luminous and well-arranged theory. The extensive tables which accompany his work will enable future mathematicians to make as frequent and convenient use of elliptic integrals as they have hitherto done of circular and logarithmic ones. In the year 1828 Mr. Abel, of Christiania in Norway, published a very remarkable theorem, which gives the sum of a series of integrals of the form \( \int \frac{P dx}{\sqrt{R}} \), where \( P \) and \( R \) are entire functions of \( x \), of the form \( x^n + ax^{n-1} + bx^{n-2} + \ldots \); \( n \) being any whole positive number, and \( a, b, \&c. \) constant coefficients. This theorem extends much further than Euler's, in as much as the latter is limited to those forms of \( R \) which contain no higher powers of \( x \) than the fourth. It departs still more widely from Euler's theorem, in exhibiting the sum, not of two only, but of many integrals of the same form. 
And it must be observed that this plurality of terms is in general necessary; for if we give to the expression \( \int \frac{dx}{\sqrt{R}} \) its utmost generality, it does not appear possible to find the sum of only two such integrals in finite algebraic, or logarithmic terms; but it is requisite to combine a greater number of them, below which number the problem cannot be reduced. Abel's theorem in general furnishes a multitude of solutions for each particular case of the problem: notwithstanding which it is possible to find other solutions which appear not to be comprised in his theorem, nor deducible from it†. On the publication of this theorem the illustrious Legendre, who at an advanced age still cultivated his favourite science with all the ardour of youth, was one of the first to feel its extent and importance. And accordingly, with a degree of zeal almost unequalled in the annals of science, he devoted a large portion of time to the verification and elucidation of the theorem by numerical examples. The result of these calculations was amply confirmatory of its truth, and it therefore undoubtedly stands upon the basis of rigorous demonstration. There can be little doubt that the ingenious mathematician to whom we are indebted for this theorem would have arrived at fresh discoveries, of not inferior value, if a premature death had not terminated his career, to the irreparable loss of science, at the early age of twenty-seven. * Exercices de Calcul Intégral. Paris, 1811. Traité des Fonctions Elliptiques. Paris, 1825. † For instance, if \( \frac{dx}{\sqrt{1+x^4}} + \frac{dy}{\sqrt{1+y^4}} = 0 \), his theorem gives the integral \( xy = 1 \); but, apparently, it does not give this other integral \( y^2 = \frac{\sqrt{1+x^4}+x\sqrt{2}}{\sqrt{1+x^4}-x\sqrt{2}} \), which was discovered by Fagnani (Produzioni Matematiche, vol. ii. p. 369.).
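Fagnani's integral in the note above is easy to test numerically. The sketch below is an editorial addition, not part of the memoir; it assumes the branch in which \( y \) is taken negative, so that for \( 0 < x < 1 \) the equation \( \frac{dx}{\sqrt{1+x^4}} + \frac{dy}{\sqrt{1+y^4}} = 0 \) holds with the principal (positive) radicals, and it compares a central-difference derivative of \( y(x) \) with \( -\sqrt{1+y^4}/\sqrt{1+x^4} \):

```python
import math

def fagnani_y(x):
    # y from Fagnani's relation y^2 = (sqrt(1+x^4) + x*sqrt(2)) / (sqrt(1+x^4) - x*sqrt(2));
    # the negative square root is chosen so that, for 0 < x < 1,
    # dx/sqrt(1+x^4) + dy/sqrt(1+y^4) = 0 holds with principal radicals
    R = math.sqrt(1 + x**4)
    return -math.sqrt((R + x * math.sqrt(2)) / (R - x * math.sqrt(2)))

x, h = 0.5, 1e-6
y = fagnani_y(x)
dydx = (fagnani_y(x + h) - fagnani_y(x - h)) / (2 * h)   # central difference
expected = -math.sqrt(1 + y**4) / math.sqrt(1 + x**4)
print(dydx, expected)  # the two derivatives agree
```

The two printed values agree to many decimal places, so the relation is indeed an integral of the differential equation on this branch.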
Before concluding this slight historical sketch of the subject, I ought not to omit mention of a valuable recent memoir by M. Poisson*, in which he has considered various forms of integrals which are not comprehended in Abel's formula. It has already been stated that the integrals to which Abel's theorem relates are those comprised in the general expression $\int \frac{P}{\sqrt{R}} \, dx$, where $P$ and $R$ are entirely polynomials in $x$. Next in order of succession to these there naturally presents itself the class of integrals whose general expression is $\int \frac{P}{\sqrt[3]{R}} \, dx$, where the polynomial $R$ is affected with a cubic instead of a quadratic radical. But Abel's theorem has no reference to these, and consequently affords us no assistance in their solution. The same may be said with regard to the succeeding classes of integrals, $\int \frac{P}{\sqrt[4]{R}} \, dx$, $\int \frac{P}{\sqrt[5]{R}} \, dx$, and generally $\int \frac{P}{\sqrt[n]{R}} \, dx$. Still less does it enable us to find the sum of such integrals as $\int \phi(R) \, dx$, $R$ being as before an entire polynomial†; and $\phi$ any function whatever. This is the problem to the solution of which the following pages will be dedicated. I may be here permitted to mention, that Abel's theorem was unknown to me until some years after its publication, and that these Researches were nearly completed before I was acquainted with it. I have, however, made no alteration in them, but have chosen to present the subject in the manner in which it originally occurred to me. I am not aware that Mr. Abel has left any memorial of the successive steps of reasoning by which he arrived at his theorem. Probably they were very different from those which I have employed, and therefore I have detailed at some length my method of investigation, beginning with the first rudiments of the theory at which I afterwards arrived. § 2. 
It was remarked by the earliest inventors of the integral calculus, that there was a mutual dependence between the two integrals $\int y \, dx$ and $\int x \, dy$, so that if the one were given the other became known, by virtue of the equation $$\int x \, dy + \int y \, dx = xy + C.$$ If therefore one of these forms happened to be more easy of integration than the other, they directed it to be substituted for it. * Crellé's Journal, vol. xii. p. 89. Berlin, 1834. † By "polynomial" I here understand an expression containing at least two different powers of $x$. There is, however, one case in which no alteration is produced by this substitution, and that is when the variable \( x \) is the same function of \( y \) that \( y \) is of \( x \); or when \( x = \phi y, \ y = \phi x \). For then the integral \( \int y \ dx \) or \( \int \phi x \ dx \) has the same form with \( \int x \ dy \) or \( \int \phi y \ dy \). In this case therefore \[ \int \phi x \ dx + \int \phi y \ dy = xy + C. \] This equation holds good whether \( \int \phi x \ dx \) can be integrated in finite terms, or whether it cannot. The equations \( x = \phi y \) and \( y = \phi x \), manifestly imply that a symmetrical equation exists between \( x \) and \( y \), and its symmetry is the only requisite condition. In other respects it may be any whatever. Notwithstanding the simplicity of this reasoning, it does not appear that any mathematician before Fagnani clearly perceived the important consequences which might be deduced from it. But he has obtained from it the following important theorem respecting the arcs of the hyperbola. If \( x \) be the abscissa of a hyperbola whose principal semi-axis \( = 1 \), its arc \[ = \int d x \sqrt{\frac{e^2 x^2 - 1}{x^2 - 1}}, \] where \( e \) is the eccentricity, or the distance between the centre and focus. Let \( y \) be another abscissa, so related to the former that \[ e y = \sqrt{\frac{e^2 x^2 - 1}{x^2 - 1}}, \] whence \[ e^2 x^2 y^2 = e^2 (x^2 + y^2) - 1. 
\] This equation being symmetrical with respect to \( x \) and \( y \), it follows that those letters may be permuted, \[ \therefore e x = \sqrt{\frac{e^2 y^2 - 1}{y^2 - 1}}. \] Multiplying these equations respectively by \( d x \) and \( d y \), and then adding them, \[ e y \ dx + e x \ dy = dx \sqrt{\frac{e^2 x^2 - 1}{x^2 - 1}} + dy \sqrt{\frac{e^2 y^2 - 1}{y^2 - 1}}. \] \[ \therefore e x y + C = \int d x \sqrt{\frac{e^2 x^2 - 1}{x^2 - 1}} + \int d y \sqrt{\frac{e^2 y^2 - 1}{y^2 - 1}}, \] which is the theorem in question. Since the arc of an ellipse may also be expressed by the formula \( \int d x \sqrt{\frac{e^2 x^2 - 1}{x^2 - 1}} \), it may be asked whether the theorem applies to the ellipse, or to the hyperbola, or to both curves? Let us therefore return to the equation \[ e^2 x^2 y^2 = e^2 (x^2 + y^2) - 1, \] whence \[ -(x^2 + y^2) + x^2 y^2 = -\frac{1}{e^2} \] \[ (1 - x^2)(1 - y^2) = 1 - \frac{1}{e^2}. \] In the ellipse the abscissæ \( x, y \) are necessarily both less than 1, and in the hyperbola they are both greater than 1. Therefore in either case the product \((1 - x^2)(1 - y^2)\) is a positive quantity, \(\therefore 1 - \frac{1}{e^2}\) must be a positive quantity, which gives \(1 > \frac{1}{e^2}\), or \(e > 1\). This condition obtains in the hyperbola but not in the ellipse, therefore the theorem is not applicable to the latter. An analogous theorem, however, exists for the ellipse, which I shall not now stop to examine. In imitation of the above proceeding, let us make the more general supposition \[ ey = \left( \frac{e^n x^n - 1}{x^n - 1} \right)^{\frac{1}{n}}, \] whence \[ e^n x^n y^n = e^n (x^n + y^n) - 1, \] a symmetrical equation; \[ \therefore ex = \left( \frac{e^n y^n - 1}{y^n - 1} \right)^{\frac{1}{n}}. 
\] Proceeding as before, we find \[ exy + C = \int \left( \frac{e^n x^n - 1}{x^n - 1} \right)^{\frac{1}{n}} dx + \int \left( \frac{e^n y^n - 1}{y^n - 1} \right)^{\frac{1}{n}} dy \] \[ = S \int \left( \frac{e^n x^n - 1}{x^n - 1} \right)^{\frac{1}{n}} dx, \] where the notation \( S \int \) is employed to express with brevity the sum of two (or any number) of similar integrals. The sums of many other integrals might be found in the same manner; but I proceed to more general inquiries. § 3. The first idea of a more extended method occurred to me about fifteen years ago, when pursuing mathematical studies at Cambridge; and it was suggested by an attentive consideration of the process by which Fagnani had rectified the hyperbola, as mentioned in the preceding section. The question occurred to me, whether it might not be possible to combine three integrals in a similar manner, by supposing two symmetrical equations to exist between three variables? Since we have \[ x y z + C = \int y z \, dx + \int x z \, dy + \int x y \, dz, \] if we suppose any two equations to exist between the variables, then \( y \) and \( z \) are functions of \( x \) which assume definite values when \( x \) is given. Therefore also the product \( y z \) is a function of \( x \), which may be called \( \phi x \). If now the two equations are symmetrical, it follows that the letters \( x, y, z \) may be permuted; which gives \( x z = \phi y \), and \( x y = \phi z \); \[ \begin{align*} x y z + C &= \int \phi x \, dx + \int \phi y \, dy + \int \phi z \, dz \\ &= S \int \phi x \, dx. \end{align*} \] It is evident that this reasoning may be extended to any number \( n \) of variables between which there exist \( n - 1 \) symmetrical equations, which circumstance renders them all similar functions of each other. Let \( r \) designate their product \( x y z \ldots \), and therefore \( \frac{r}{x} = y z \ldots \), or the product of all except \( x \).
\[ \begin{align*} r + C &= \int \frac{r}{x} \, dx + \int \frac{r}{y} \, dy + \ldots, &c. \end{align*} \] But if \( \frac{r}{x} = \phi x \), we have by merely permuting the letters, \[ \begin{align*} \frac{r}{y} &= \phi y, \frac{r}{z} = \phi z, &c. \end{align*} \] Therefore \[ \begin{align*} r + C &= \int \phi x \cdot dx + \int \phi y \cdot dy + \ldots, &c. \\ &= S \int \phi x \cdot dx. \quad (A.) \end{align*} \] This equation I first obtained in the year 1821, but not having leisure at that time to pursue the subject much further, I contented myself with making a note of it as being a subject that deserved to be further examined into. I afterwards found it to be the key, as it were, of the whole method. In the year 1825 I resumed this investigation, and endeavoured, by the trial of various forms of symmetrical equations between the variables, to see whether this method would lead to new results, or whether, on the contrary, it would turn out to be a mere variation of the methods in common use. I here give the results of some of these early trials, just as I find them in the original papers. **Ex. 1.** Let the 2 symmetrical equations be (1.) \( x + y + z = a \) (2.) \( x^2 + y^2 + z^2 = b^2 \), \( a \) and \( b^2 \) being constants. These give \[ \varphi x \text{ or } y z = \frac{a^2 - b^2}{2} - ax + x^2. \] And theorem A. gives (3.) \( xy z + C = \int \varphi x \, dx + \int \varphi y \, dy + \int \varphi z \, dz. \) Now here it is easy to verify the theorem, because \( \int \varphi x \, dx \) is known, viz. \[ \int \varphi x \, dx = \frac{a^2 - b^2}{2} x - \frac{a}{2} x^2 + \frac{x^3}{3}, \] and similarly with respect to \( \int \varphi y \, dy \) and \( \int \varphi z \, dz \); \( \therefore \) by addition, \[ S \int \varphi x \, dx = \frac{a^2 - b^2}{2} (x + y + z) - \frac{a}{2} (x^2 + y^2 + z^2) + \frac{x^3 + y^3 + z^3}{3}, \] or, by help of equations (1.) and (2.), \[ S \int \varphi x \cdot dx = \text{const.} + \frac{x^3 + y^3 + z^3}{3}.
\] Although this result differs at first sight from that given by equation (3.), \[ S \int \varphi x \, dx = \text{const.} + xy z, \] yet it is easy to see that they are identical. For since \[ x + y + z = \text{const.}, \text{ by equation (1.)}, \] and \[ x^2 + y^2 + z^2 = \text{const.}, \text{ by equation (2.)}, \] it follows as a necessary consequence that \[ \frac{x^3 + y^3 + z^3}{3} = xy z + \text{const.}, \] which verifies the theorem in this case. In the examples which follow next I shall suppose one of the given equations to be \[ x + y + z = 0. \] Ex. 2. Let the other equation be \((x^2 - 1)(y^2 - 1)(z^2 - 1) = -1\), we find \[ \varphi x, \text{ or } y z = -1 + \sqrt{\frac{1 + x^2 - x^4}{1 - x^2}}. \] Ex. 3. Let \(x^4 + y^4 + z^4 = 2xy z\), we find \[ 2yz = 2x^2 + x + \sqrt{4x^3 + x^2} = 2\varphi x. \] Ex. 4. Let \(x^5 + y^5 + z^5 = -5\), we find \[ yz = \frac{x^2}{2} + \sqrt{\frac{1}{x} + \frac{x^4}{4}} = \varphi x. \] In each of these cases therefore we find the sum of three integrals of the form \( \int \varphi x \cdot dx \) to be equal to \( xy z + C \). Before going further it may be well to adopt, for the sake of brevity, the following notation. Let there be any number of variables, three for example; then the sum of their \( n \)th powers or \( x^n + y^n + z^n \) may be briefly written \( Sx^n \), and similarly if \( f(x) \) be any function of \( x \), \( Sf(x) \) stands for the sum of \( f(x) + f(y) + f(z) \). Also \( Sxy \) means \( xy + xz + yz \). And in general \[ Sf(x,y) = f(x,y) + f(y,x) + f(x,z) + f(z,x) + f(y,z) + f(z,y), \] being the sum of all the permutations of the letters. A few examples will render this notation familiar. Let there be 3 variables, then \[ Sx = x + y + z \\ Sxy = xy + xz + yz \\ Sx^2y = x^2y + y^2x + x^2z + z^2x + y^2z + z^2y \\ Sx^2y^2 = x^2y^2 + x^2z^2 + y^2z^2. \] Let \( r = xyz \).
\[ S\frac{r}{x} = yz + xz + xy = Sxy \\ S\frac{1}{x} = \frac{1}{r} Sxy \\ S\frac{1}{x^2} = \frac{1}{r^2} Sx^2y^2 \\ S\frac{1}{x^n} = \frac{1}{r^n} Sx^ny^n, \text{&c. &c.} \] Let there be 5 variables, \( u, v, x, y, z \), then \[ Suvxy = uvxy + uvxz + uvyz + uxyz + vxyz, \text{&c. &c. &c.} \] The greater the number of the variables, the greater is the advantage of this abbreviated notation. To resume our examples: **Ex. 5.**—Let \( Sx^2y^2 + axyz + bSxy = c \). Then supposing, as before, that \( x + y + z = 0 \), we find \[ 2yz = 2x^2 - ax - b + \sqrt{(4c + b^2) + 2abx + a^2x^2 - 4ax^3}. \] By properly determining the constants \( a, b, c \), this radical may be made to agree with any proposed cubic \( \sqrt{x^3 + ax^2 + bx + c} \). **Ex. 6.**—Let \( x^6 + y^6 + z^6 = 0 \), or \( Sx^6 = 0 \). Here \( yz \) or \( \varphi x \) is an implicit function of \( x \), only determinable by the solution of the cubic equation \[ (m - x^2)^3 = \frac{3}{2} x^2 m^2, \] where \( m = yz \). Notwithstanding the complicated nature of the function $\phi x$, we still have $$S \int \phi x \cdot dx = xyz + C.$$ But the most interesting result was obtained by combining the equations $$x + y + z = 0$$ $$xy + xz + yz = -\frac{x^2 y^2 z^2}{4};$$ or, in the compendious notation, putting $xyz = r$, $$Sx = 0 \quad Sxy = \frac{-r^2}{4};$$ whence $$yz = \frac{2\sqrt{1 + x^4}}{x^2} - \frac{2}{x^2} = \phi x,$$ and by the general theorem A $$xyz + C = S \int \phi x \cdot dx. \quad \ldots \ldots \ldots \ldots (1.)$$ But $\int \phi x \cdot dx$ consists of two parts, of which the latter is $\int \frac{-2dx}{x^2} = \frac{2}{x}$. Therefore the sum of three such portions $= \frac{2}{x} + \frac{2}{y} + \frac{2}{z} = 2S \frac{1}{x} = \frac{2Sxy}{r} = \frac{-r}{2}$; since by hypothesis $Sxy = \frac{-r^2}{4}$. Hence if we put $$\int \frac{\sqrt{1 + x^4}}{x^2} \cdot dx = \int \psi x \cdot dx,$$ equation (1.)
becomes $$r + C = 2S \int \psi x \cdot dx - \frac{r}{2};$$ whence $$S \int \psi x \cdot dx = \frac{3}{4} r + \text{const.}$$ Now this result is highly deserving of attention; for the integral which we have here called $\int \psi x \cdot dx = \int \frac{\sqrt{1 + x^4}}{x^2} \cdot dx$, is no other than the arc of an equilateral hyperbola whose abscissa is $x$, the equation of the curve being referred to its asymptotes. When I arrived at this result, I immediately perceived that (provided there were no error in the reasoning, of which I at first entertained some doubts,) it was an entirely new and undiscovered property of the hyperbola. I therefore proceeded to verify it by calculating numerical examples. The theorem may be stated thus: If three abscissae of an equilateral hyperbola verify the equations $Sx = 0$, $Sxy = \frac{-r^2}{4}$, the sum of the arcs subtended by those abscissae $= \frac{3}{4} r + \text{const.}$ In order to eliminate the constant, the value of which was unknown, I supposed one of the abscissae $x$ to assume some other value $x'$, and therefore a corresponding change to take place with respect to \( y \) and \( z \) (since they are functions of \( x \)). These new values may be called \( y' \) and \( z' \). At the same time the product \( xyz = r \) is changed to \( x'y'z' = r' \); and the first set of arcs, which may be denoted by \( A \), are changed for a second set, which may be denoted by \( B \). Now the original equation gives, sum of arcs \( A = \frac{3}{4}r + C \), and the changed equation gives, sum of arcs \( B = \frac{3}{4}r' + C \); \( \therefore \) by subtraction, sum of arcs \( A - \) sum of arcs \( B = \frac{3}{4}(r - r') \), in which result the constant is eliminated. The accompanying figure 1. represents two opposite equilateral hyperbolas, with their asymptotes.
\( C \) is the centre, and origin of the abscissæ \( CX = x, CY = y, CZ = z \), of which the latter must be negative (supposing the two former positive), by reason of the equation \( x + y + z = 0 \). Therefore it belongs to the opposite hyperbola. If \( XP \) is the ordinate corresponding to the abscissa \( CX \), the equation of the curve is \( CX \cdot XP = 1 \), and the arc subtended by \( CX \) is the infinite arc \( OP \). Fig. 2. represents the abscissa \( CX = x \), both in its original and in its altered state when it has become \( CX' = x' \). In the former case it subtends the infinite arc \( OP \), and in the latter case the infinite arc \( OP' \). But in taking the difference there remains the portion of abscissa \( XX' \) subtended by the arc \( PP' \), which is a finite quantity, and thus the embarrassing consideration of the infinite arcs is avoided. Now the sum of arcs \( A - \) sum of arcs \( B = \) sum of three limited arcs, of which \( P P' \) is one, and the others subtend the portions of abscissæ \( YY' \) and \( ZZ' \). Denoting these arcs by \( K \), we have this equation in finite terms: \[ \text{Sum of arcs } K = \frac{3}{4}(r - r'). \] Now in order to put this equation to the test of numerical computation, it is requisite to find three quantities that verify the equations \[ Sx = 0 \quad Sxy = \frac{-r^2}{4} \] Suppose, therefore, \[ x = 1 \\ y = 1.7535 \\ z = -2.7535, \] whence \[ xyz = -4.8281 = r. \] The equations are satisfied by these values, and also by the following: \[ x' = 1.1 \\ y' = 1.5826 \\ z' = -2.6826, \] whence \[ x' y' z' = -4.670 = r'. \] \[\begin{align*} [4.] & \quad \therefore x' - x = 0.1 \\ & \quad y' - y = -0.1709 \\ & \quad z' - z = 0.0709 \\ & \quad r' - r = 0.1581. \end{align*}\] Now we can, without difficulty, calculate the approximate value of the arc (P P' in fig. 2.) subtended by the portion of abscissa \(x' - x\) (X X' in the figure). 
Calling this the arc \((x)\), we have \[ \text{Arc}(x) = 0.1351 \] \[ \text{Arc}(y) = 0.1817 \] \[ \text{Arc}(z) = 0.0715. \] And according to the theorem we ought to have \[ \text{Sum of arcs} = \frac{3}{4} (r' - r) = 0.118. \] But in this example, arc \((x)\) is to be accounted negative. Therefore we have \[ \text{Arc}(y) + \text{Arc}(z) = 0.253 \] \[ - \text{Arc}(x) = 0.135 \] \[ \text{Sum} = 0.118 \] which is in accordance with the theorem. Second example.—Suppose, as before, \[ x = 1 \] \[ y = 1.7535 \] \[ z = -2.7535, \] whence \[ x y z = -4.8281. \] And also \[ x' = 2 \] \[ y' = 0.8875 \] \[ z' = -2.8875, \] whence \[ x' y' z' = -5.1253; \] both of which systems of values satisfy the given equations of condition. \[\begin{align*} [5.] & \quad \therefore x' - x = 1 \\ & \quad y' - y = -0.866 \\ & \quad z' - z = -0.134 \\ & \quad r' - r = -0.297. \end{align*}\] By calculation we find \[ \text{Arc}(x) = 1.1319 \] \[ \text{Arc}(y) = 1.0443 \] \[ \text{Arc}(z) = 0.1350. \] But in this example both arc \((x)\) and arc \((z)\) are negative. Therefore we have \[- \text{Arc}(x) - \text{Arc}(z) = -1.267\] \[+ \text{Arc}(y) = +1.044\] \[\text{Sum} = -0.223\] Also \[r' - r = -0.297\] \[\therefore \frac{3}{4} (r' - r) = -0.223\] in accordance with the theorem. I had at first some difficulty to perceive the reason why some of the arcs were to be considered negative rather than the others. This question was one of a novel nature, which had not hitherto occurred to analysts, and therefore no solution of it was to be met with in books. On the other hand, to leave so essential a point without any demonstration was unsatisfactory. But the following considerations appeared to afford an explanation of this fact. In the first example, since \(x + y + z = 0\), and both \(x\) and \(y\) are positive quantities, \(z\) must be negative. Therefore \(xy\) is positive, but \(xz\) and \(yz\) are negative. 
Now we have \[\frac{yz}{2} = \frac{\sqrt{1 + x^4} - 1}{x^2},\] which therefore must be negative: \(\therefore\) also \(\sqrt{1 + x^4} - 1\) must be negative. But this quantity would necessarily be positive if the radical had a positive sign. Therefore the radical must have a negative sign. Reasoning in the same manner, because \(\frac{xz}{2} = \frac{\sqrt{1 + y^4} - 1}{y^2}\), this radical has necessarily a negative sign; and because \(\frac{xy}{2} = \frac{\sqrt{1 + z^4} - 1}{z^2}\), this radical has a positive sign. Attributing, therefore, these signs to the radicals, the three hyperbolic arcs are respectively, \[-\int \frac{dx}{x^2} \sqrt{1 + x^4}, \quad -\int \frac{dy}{y^2} \sqrt{1 + y^4}, \quad +\int \frac{dz}{z^2} \sqrt{1 + z^4}.\] On the other hand, during the change of \(x, y, z\) to \(x', y', z'\) respectively, we have seen that \(x\) and \(z\) increase while \(y\) diminishes (see equations [4.]); \(\therefore dx\) and \(dz\) are to be accounted positive, and \(dy\) negative. This consideration renders it necessary to write \[[6.] \quad -\int \frac{dx}{x^2} \sqrt{1 + x^4}, \quad +\int \frac{dy}{y^2} \sqrt{1 + y^4}, \quad +\int \frac{dz}{z^2} \sqrt{1 + z^4}.\] And thus the assertion that arc \((x)\) is to be considered negative in this example is justified. Second example.—Here the reasoning remains the same as far as regards the signs of the different radicals; but it appears from the equations [5.], that while \(x\) increases, both \(y\) and \(z\) diminish, \(\therefore dx\) is positive, and \(dy\) and \(dz\) are negative. Therefore the only difference between this example and the former respects the sign of \(dz\). Therefore equation [6.] must be written \[-\int \frac{dx}{x^2} \sqrt{1 + x^4}, \quad +\int \frac{dy}{y^2} \sqrt{1 + y^4}, \quad -\int \frac{dz}{z^2} \sqrt{1 + z^4},\] which justifies the assertion, that in this example arcs \( (x) \) and \( (z) \) are to be accounted negative, and arc \( (y) \) positive.
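The first numerical example, together with the signs just assigned, can be reproduced by machine computation. The following sketch is an editorial addition, not Talbot's; it assumes the branch of \( r \) used in the paper's figures (\( r \approx -4.83 \) at \( x = 1 \)), rebuilds the triples from the conditions \( Sx = 0 \), \( Sxy = -\frac{r^2}{4} \), computes the three arcs by Simpson quadrature, and applies the signs derived above (arc \( (x) \) negative, arcs \( (y) \) and \( (z) \) positive):

```python
import math

def psi(t):
    # Talbot's integrand: arc element of the equilateral hyperbola,
    # ds = sqrt(1 + t^4) / t^2 dt along the abscissa
    return math.sqrt(1 + t**4) / t**2

def simpson(f, a, b, n=2000):
    # composite Simpson quadrature (n even)
    h = (b - a) / n
    s = f(a) + f(b) + sum((4 if i % 2 else 2) * f(a + i * h) for i in range(1, n))
    return s * h / 3

def triple(x):
    # given x, produce (y, z, r) with Sx = 0 and Sxy = -r^2/4 (r = xyz),
    # on the branch of r^2 + (4/x) r - 4 x^2 = 0 used in the paper's examples
    r = -2.0 / x - 2.0 * math.sqrt(x * x + 1.0 / (x * x))
    # then y + z = -x and y * z = x^2 - r^2 / 4
    disc = math.sqrt(x * x - 4.0 * (x * x - r * r / 4.0))
    y, z = (-x + disc) / 2.0, (-x - disc) / 2.0
    return y, z, r

x0, x1 = 1.0, 1.1
y0, z0, r0 = triple(x0)
y1, z1, r1 = triple(x1)
arc_x = simpson(psi, x0, x1)            # x increases from 1 to 1.1
arc_y = simpson(psi, y1, y0)            # y decreases from 1.7535 to 1.5831
arc_z = simpson(psi, abs(z1), abs(z0))  # |z| decreases from 2.7535 to 2.6831
total = -arc_x + arc_y + arc_z          # the signs of the first example
print(round(total, 4), round(0.75 * (r1 - r0), 4))  # both print 0.1171
```

The agreement is exact to quadrature accuracy, which confirms the theorem rather more sharply than the four-figure hand computation of 1825 could.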
Perhaps this reasoning may not be altogether free from objection: I wish it, therefore, to be remembered that I am not giving it here as being the most convenient method of determining the signs of the arcs, but merely as being the reasoning I employed at the time* when I first met with this theorem. This theorem shows that three hyperbolic arcs may be determined in an infinity of ways, so that their sum may be an algebraic quantity. At the same time it shows that one of these arcs cannot be supposed always to be 0, so that Fagnani's theorem respecting the sum of two arcs is not an instance or particular case of this. I have dwelt at some length on this theorem, because the theory of the conic sections has always been regarded as so important by mathematicians that any considerable addition to it is thought deserving of attention. I now proceed to other results which presented themselves in the course of this inquiry. Still continuing to suppose the variables to be three in number, it is allowable to suppose between them any two symmetrical equations whatever; and thence if we can deduce the value of \( yz \) or \( \varphi x \) in terms of \( x \) alone, we may apply the general theorem A. **Ex. 1.** Let \( Sx = a \), and \( Sxy = \left( \frac{xyz}{2} \right)^2 \), we find \[ \frac{yz}{2} = \frac{1}{x^2} + \frac{\sqrt{1 - x^4 + ax^3}}{x^2}. \] **Ex. 2.** Let \( Sx = a \), and \( (Sxy)^2 = 2b \cdot xyz \), we find \[ yz = x^2 - (a - b)x + \sqrt{2bx^3 + (b^2 - 2ab)x^2}. \] **Ex. 3.** Let \( \sqrt{x} + \sqrt{y} + \sqrt{z} = \sqrt{a} \) (or \( S\sqrt{x} = \sqrt{a} \)), and let \( Sx = b \), we find \[ \sqrt{yz} = \frac{a - b}{2} - \sqrt{ax} + x = f x, \] whence \[ yz = \varphi x = [fx]^2. \] A great variety of different suppositions of this sort may be made; but if the resulting function \( \varphi x \) should become too complicated, little practical advantage would be derived from the knowledge of its properties.
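Ex. 3. can be checked by direct computation. In this sketch (an editorial addition, with arbitrarily chosen constants \( a = 16, b = 7 \) and abscissa \( x = 1 \)), \( y \) and \( z \) are reconstructed from the two symmetrical conditions and the product \( yz \) is compared with \( [fx]^2 \):

```python
import math

# arbitrarily chosen constants (editorial, not Talbot's): any a, b, x
# yielding real positive y and z will do
a, b, x = 16.0, 7.0, 1.0

# rebuild y and z from the two symmetrical conditions
# sqrt(x) + sqrt(y) + sqrt(z) = sqrt(a)  and  x + y + z = b
s = math.sqrt(a) - math.sqrt(x)    # sqrt(y) + sqrt(z)
m = b - x                          # y + z
p = (s * s - m) / 2.0              # sqrt(y) * sqrt(z)
disc = math.sqrt(s * s - 4.0 * p)
sy, sz = (s + disc) / 2.0, (s - disc) / 2.0
y, z = sy * sy, sz * sz

# Talbot's closed form: sqrt(yz) = (a - b)/2 - sqrt(ax) + x = fx
fx = (a - b) / 2.0 - math.sqrt(a * x) + x
print(round(y * z, 6), round(fx * fx, 6))  # both print 2.25
```

With these constants \( y \approx 5.598 \) and \( z \approx 0.402 \), and the two values of \( yz \) coincide.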
I therefore thought of another method of obtaining this function, by means of what may be termed "changing the conditions." Thus let the original equations of condition be \[ x + y + z = 0, \quad \text{and} \quad xy + xz + yz = -1, \] whence this equation results, \[ yz = x^2 - 1. \] * 1825. Now for \( x, y, z \) write their cubes (both in the original and in the resulting equation), and they become \[ [7.] \quad x^3 + y^3 + z^3 = 0 \\ x^3 y^3 + x^3 z^3 + y^3 z^3 = -1 \] and \[ y^3 z^3 = x^6 - 1. \] Taking the cube root of this, we have \[ yz = \sqrt[3]{x^6 - 1}. \] Whence it follows that the sum of three integrals \[ \int dx \sqrt[3]{x^6 - 1} + \int dy \sqrt[3]{y^6 - 1} + \int dz \sqrt[3]{z^6 - 1} = xyz + C, \] whenever \( x, y, z \) satisfy the two given equations of condition [7.], which may be briefly written \[ Sx^3 = 0 \quad Sx^3y^3 = -1. \] Here we changed the conditions, by writing \( x^3 \) for \( x \). We might have written \( x^n \) for \( x \), and thereby obtained a more general result*. Even values of \( n \) must, however, be excluded, because the equation \( x^n + y^n + z^n = 0 \) would otherwise be impossible. **Ex. 4.** Let \( Sx = a \), and \( S \frac{1}{x} = \frac{1}{b} \), whence \[ yz = \frac{abx - bx^2}{x - b}. \] Now if we write for \( x, y, z, a, b \), their square roots, these three equations become \[ \sqrt{x} + \sqrt{y} + \sqrt{z} = \sqrt{a}, \] \[ \sqrt{\frac{1}{x}} + \sqrt{\frac{1}{y}} + \sqrt{\frac{1}{z}} = \sqrt{\frac{1}{b}}, \] and \[ \sqrt{yz} = \frac{\sqrt{abx} - \sqrt{b} \cdot x}{\sqrt{x} - \sqrt{b}} = f(x), \] whence \[ yz = \varphi(x) = [fx]^2. \] Many interesting theorems may be obtained by this method of "changing the original conditions," but these examples of it will suffice for the present. I now perceived that the hypothesis upon which my method was grounded, viz.
that \( n - 1 \) symmetrical equations existed between \( n \) variables, was the same thing as to suppose that these variables were the roots of an equation of \( n \) dimensions, one of whose coefficients at least was variable, the others being either constants, or functions of the variable one. This consideration introduced a great degree of clearness and simplicity into the subject, besides facilitating in no ordinary degree the progress of research. * Viz. the sum of three integrals, like \( \int dx \sqrt[n]{x^{2n} - 1} \). For instance, suppose there are 3 variables, and let \( x + y + z = p \), \( xy + xz + yz = q \) and \( xyz = r \), then \( x, y, z \) are the roots of the equation \[ u^3 - pu^2 + qu - r = 0, \] where the variable \( u \) denotes indifferently either of the variables \( x, y, \) or \( z \). This new letter \( u \) is only introduced for the sake of clearness, since we may equally well say that \( x, y, z \) are the roots of the equation \[ x^3 - px^2 + qx - r = 0, \] or of \[ y^3 - py^2 + qy - r = 0, \text{ &c.} \] This latter mode of expression is often more convenient. Now the function \( \phi x \), which we wish to determine, is \[ yz = \frac{xyz}{x} = \frac{r}{x}; \] and since \( p, q \) are here supposed to be given functions of \( r \), we may find the value of \( \frac{r}{x} \) in terms of \( x \), provided we can solve the algebraic equation \[ x^3 - px^2 + qx - r = 0, \] with respect to \( r \). **Example.** Let us resume the question concerning the sum of 3 arcs in the equilateral hyperbola. The equations of condition were \[ x + y + z = 0, \] \[ xy + xz + yz = \frac{-x^2 y^2 z^2}{4}, \] or \[ p = 0, q = \frac{-r^2}{4}, \] \(\therefore x, y, z\) are the roots of \[ x^3 - \frac{r^2}{4} x - r = 0.
\] This equation (arranged according to the powers of \( r \)) is \[ -\frac{x}{4} r^2 - r + x^3 = 0, \] or \[ r^2 + \frac{4}{x} r - 4 x^2 = 0, \] whence \[ r = \frac{-2}{x} + 2 \sqrt{x^2 + \frac{1}{x^2}}, \] and \[ \frac{r}{x} = \frac{-2}{x^2} + \frac{2 \sqrt{1 + x^4}}{x^2} = \phi x, \] which agrees with the former result. But now we are able to point out with clearness the limits of the possibility of the theorem. For the cubic equation \[ x^3 - \frac{r^2}{4} x - r = 0 \] being compared with the form \[ x^3 - ax + b = 0, \] must have impossible roots when \( \frac{b^2}{4} > \frac{a^3}{27} \), or when \( \frac{r^2}{4} > \frac{r^6}{64 \times 27} \), or \( 1 > \frac{r^4}{16 \times 27} \), or \( 16 \times 27 > r^4 \). Hence it appears that there are impossible roots whenever \( r \) is numerically less than \[ \sqrt[4]{16 \times 27} = 2 \sqrt[4]{27} = 4.559. \] Accordingly in our numerical examples it will be seen that the values* of \( r \) are not contained within these limits. Another example. We have found (page 183. Ex. 3.) that the suppositions \( Sx = 0 \), \( Sx^4 = 2r \), give \[ 2\varphi x = 2x^2 + x + \sqrt{4x^3 + x^2}. \] Here \( x, y, z \) are roots of \( x^3 + qx - r = 0 \), and we easily find from the doctrine of equations that \[ Sx^4 = 2q^2, \quad \therefore 2q^2 = 2r, \quad \therefore q^2 = r. \] Therefore \( x^3 + qx - q^2 = 0 \). Solving this quadratic equation with respect to \( q \) we have \[ q = \frac{x}{2} + \frac{1}{2} \sqrt{x^2 + 4x^3}. \] But since \[ x^3 + qx - r = 0, \quad \frac{r}{x} = x^2 + q. \] \[ \therefore \frac{2r}{x} = 2x^2 + x + \sqrt{x^2 + 4x^3} = 2\varphi x, \] which agrees with the former result. Another example. Let \( x^3 + qx - r = 0 \), which gives for the first condition \[ Sx = 0. \] And let the second condition be \[ r^2 + cr = q^2 + aq + b, \] \( a, b, c \) being constants. We find \[ \frac{r}{x} = \frac{x^2 + \frac{c}{2}x - \frac{a}{2} + \sqrt{X}}{1 - x^2}, \] where \( X \) is a polynomial of 6 dimensions.
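Both closed forms for \( r \) obtained in the foregoing examples admit of an immediate arithmetical check. The following sketch, in modern notation (Python; no part of the memoir, and the sample values of \( x \) are arbitrary), verifies that \( r = x \varphi x \) does satisfy the corresponding cubic in each case.

```python
import math

# First example (equilateral hyperbola): r/x = phi(x) must make
# r = x*phi(x) a root of  x^3 - (r^2/4) x - r = 0.
def phi(x):
    return -2.0 / x**2 + 2.0 * math.sqrt(1.0 + x**4) / x**2

for x in (0.7, 1.3, 2.0):
    r = x * phi(x)
    assert abs(x**3 - (r * r / 4.0) * x - r) < 1e-9

# Second example (Sx = 0, Sx^4 = 2r): here q^2 = r with
# q = x/2 + (1/2) sqrt(x^2 + 4x^3); then x^3 + qx - r = 0, and
# 2r/x reproduces 2*phi(x) = 2x^2 + x + sqrt(x^2 + 4x^3).
for x in (0.5, 1.0, 1.7):
    q = x / 2.0 + 0.5 * math.sqrt(x * x + 4.0 * x**3)
    r = q * q
    assert abs(x**3 + q * x - r) < 1e-9
    assert abs(2.0 * r / x - (2.0 * x * x + x + math.sqrt(x * x + 4.0 * x**3))) < 1e-9
```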
Now let the \( n \) variables \( x, y, z \ldots \) be roots of \( x^n - px^{n-1} + \ldots \pm r = 0 \), where I continue to denote the product of all the roots by \( r \); we have still \( \varphi x = \frac{r}{x} \). Let the coefficients \( p, \&c. \&c. \) be replaced by their values in terms of \( r \) (which are supposed given, by means of \( n - 1 \) equations of condition). Then let the equation be arranged according to the powers of \( r \), and the solution of it will give the value of \( r \) in terms of \( x \), and therefore the value of \( \frac{r}{x} = \varphi x \). (* These were \( r = -4.83, r = -4.67, r = -5.13. \)) Now if it be considered that this method extends to any number whatever of variables, and that the coefficients of the equation may be any functions of each other that we please to make them, it will appear at once how wide a field of inquiry here opens before us. It was the wish to reduce these extensive but rather complicated results to something like a clear and connected system which obliged me to defer their publication longer than I should otherwise have wished; by which means I lost the priority, at one time in my power, of announcing the existence of this new branch of analysis: for the results hitherto mentioned, together with many others, which for the sake of brevity I omit, were obtained in the years 1825 and 1826, and consequently two or three years previously to the publication of Abel's theorem. And it will be observed that they comprise large classes of integrals which are not contained in his formula $$\int \frac{P \, dx}{\sqrt{R}}.$$ Of this I have given an instance in the integral, $$\int dx \sqrt[3]{x^6 - 1},$$ and the more general one, $$\int dx \sqrt[n]{x^{2n} - 1}.$$ But an unlimited number of such forms may be found by the method I have pointed out of "changing the conditions" at first established between the variables.
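The instance \( \int dx \sqrt[3]{x^6 - 1} \) may also be tested numerically. In the sketch below (modern notation, Python; the quadrature rule and the starting triple \( (0, 1, -1) \) are choices of the illustration, not of the memoir), \( y \) and \( z \) are recovered from the two conditions, and the combination of the three integrals with \( xyz \) is seen to remain constant.

```python
import math

def cbrt(u):                         # real cube root
    return math.copysign(abs(u) ** (1.0 / 3.0), u)

def simpson(f, lo, hi, n=4000):      # composite Simpson rule, n even
    h = (hi - lo) / n
    return (h / 3.0) * (f(lo) + f(hi)
        + 4.0 * sum(f(lo + (2 * k - 1) * h) for k in range(1, n // 2 + 1))
        + 2.0 * sum(f(lo + 2 * k * h) for k in range(1, n // 2)))

def G(x):                            # antiderivative (from 0) of (t^6 - 1)^(1/3)
    return simpson(lambda t: cbrt(t**6 - 1.0), 0.0, x)

def invariant(x):
    # Given x, the conditions Sx^3 = 0, Sx^3y^3 = -1 make y^3 and z^3 the
    # roots of w^2 + x^3 w + (x^6 - 1) = 0; real while |x| <= (4/3)^(1/6).
    s = x**3
    d = math.sqrt(4.0 - 3.0 * s * s)
    y = cbrt((-s + d) / 2.0)
    z = cbrt((-s - d) / 2.0)
    return G(x) + G(y) + G(z) - x * y * z

# The family starts from (x, y, z) = (0, 1, -1), where the constant is 0,
# and G(x) + G(y) + G(z) - xyz stays at that value along the family.
for x in (0.2, 0.5, 0.9):
    assert abs(invariant(x)) < 1e-3
```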
We may conclude therefore that if $x, y, z \ldots$ are the roots of an equation of $n$ dimensions, having at least one variable coefficient, and if we can find the function $\phi x = \frac{r}{x}$ in terms of $x$, we may thence deduce the algebraic sum of the $n$ integrals, $$\int \phi x \cdot dx + \int \phi y \cdot dy + \ldots.$$ But the inverse problem still remains. Given the function $\phi x$, or the integral $\int \phi x \cdot dx$, to find the equation $$x^n - p x^{n-1} + \ldots \pm r = 0,$$ of which $x, y, z \ldots$ must be roots, in order that $\int \phi x \cdot dx$ may have an algebraic sum. This is evidently the most important part of the subject, for in applying the method to practice the form of the function $\phi x$ is given beforehand. That this research requires methods of its own will appear at once from a simple example. Let $\int \sqrt{1 + x^n} \cdot dx$ be the proposed integral, where $n$ is any whole number. Let us first suppose $$\sqrt{1 + x^n} = \phi x = \frac{r}{x}.$$ This gives \[ r = x \sqrt{1 + x^n}, \] or \[ x^{n+2} + x^2 - r^2 = 0, \] an equation, the product of whose roots must be \( \pm r^2 \). But by the hypothesis this product is always represented by \( r \). Whence it follows that no solution of the required problem is effected by the supposition \( \sqrt{1 + x^n} = \frac{r}{x} \). And at the same time we see that in order for any supposition to be successful, it is necessary that the resulting equation, arranged according to the powers of \( x \), should have \( r \) for its last term. Now let us remark, that if \( S \int \psi x \cdot dx \) has an algebraic sum, then \( S \int (\psi x + x^a) \cdot dx \) has likewise an algebraic sum. For it equals the former sum, with the addition of \( S \int x^a \cdot dx \), or \( \frac{1}{a+1} (x^{a+1} + y^{a+1} + z^{a+1} + \ldots) \), which is an algebraic quantity. In the same way we see that \( S \int (\psi x + m x^a + n x^b + \ldots) \cdot dx \) has an algebraic sum if \( m, n, a, b, \) &c.
are constants, and if there are any number of such simple terms of the form \( m x^a \). Hence if the proposed integral be \( \int \psi x \cdot dx \), and the supposition \( \varphi x \) or \( \frac{r}{x} = \psi x \) does not succeed, we are led to try the suppositions \[ \frac{r}{x} = \psi x + x, \quad \frac{r}{x} = \psi x + x + x^2, \text{ &c. &c.} \] **Example.** Let the integral \( \int \psi x \cdot dx = \int \sqrt{1 + x^n} \cdot dx \) as before. Suppose \( \sqrt{1 + x^n} = \frac{x}{2} + \sqrt{1 + r} \), whence we deduce \[ x^n - \frac{x^2}{4} - \sqrt{1 + r} \cdot x - r = 0, \] an equation which has \( n \) roots, whose product is \( r \). We also find \[ 1 + r = 1 + x^n - x \sqrt{1 + x^n} + \frac{x^2}{4}, \] whence \[ \frac{r}{x} = x^{n-1} + \frac{x}{4} - \sqrt{1 + x^n}, \] or \[ \varphi x = x^{n-1} + \frac{x}{4} - \psi x. \] And therefore since \( S \int \varphi x \cdot dx \) has an algebraic sum \( = r + \text{const.} \), it follows that \( S \int \psi x \cdot dx \) has an algebraic sum also, viz. \[ S \int x^{n-1} \, dx + \frac{1}{4} S \int x \, dx - (r + \text{const.}) \] or \[ \frac{1}{n} S x^n + \frac{1}{8} S x^2 - (r + \text{const.}) \] But if the form of the proposed integral \( \int \psi x \cdot dx \) is complicated, no doubt it would be difficult to find an equation like \[ \frac{r}{x} = \pm \psi x + m x^a + n x^b + \ldots, \] such that when developed and arranged according to the powers* of \( x \) its last term should be \( r \). Probably this is not possible in general. And yet the proposed integrals \( S \int \psi x \, dx \) may have an algebraic sum. For hitherto we have tacitly supposed that this algebraic quantity, if it existed, was the product of the variables \( = r + \text{const.} \), since we have derived all our reasonings from the theorem \[ xyz \ldots = \int yz \ldots \, dx + \int xz \ldots \, dy + \&c. \] But it is evident that the algebraic sum may as well have any other form as the one in question.
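The supposition just employed may be verified arithmetically: with \( r \) defined from \( x \) as above, the assumed relation and the resulting \( \varphi x \) are identities. A short check in modern notation (Python; the exponent \( n = 5 \), taken odd, and the sample values are arbitrary choices, not the memoir's):

```python
import math

n = 5                                     # any odd exponent
for x in (0.3, 0.7, 1.5, 2.0):
    psi = math.sqrt(1.0 + x**n)           # psi(x) = sqrt(1 + x^n)
    r = x**n - x * psi + x * x / 4.0      # from 1 + r = 1 + x^n - x*psi + x^2/4
    # the supposition: sqrt(1 + x^n) = x/2 + sqrt(1 + r)
    assert abs(psi - (x / 2.0 + math.sqrt(1.0 + r))) < 1e-12
    # and the resulting phi(x) = r/x = x^(n-1) + x/4 - psi(x)
    assert abs(r / x - (x**(n - 1) + x / 4.0 - psi)) < 1e-12
```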
It may be a constant or any symmetrical combination of the variables. The foundation of our reasoning has therefore hitherto been too limited, and requires to be extended. Let us therefore direct our inquiries to the attainment of a more general method. § 4. Exposition of a more general method. If \( x, y, z \ldots \) are the roots of any equation, \[ x^n - p x^{n-1} + p' x^{n-2} \ldots = 0, \] then not only the coefficients themselves \( p, p', p'', \&c. \), but also all combinations of them, are symmetrical functions of the roots. Let \( v \) be a general symbol denoting any one of these coefficients or of these combinations. Then \( v \) may be considered either as a function of all the roots, or of only one of them. And in the latter case this root may be changed for any of the others without causing any alteration in the value of \( v \). Example. Let there be two variables \( x \) and \( y \), roots of \( x^2 - vx + 1 = 0 \), which may be also written \( y^2 - vy + 1 = 0 \). Then \( v \), if considered as a function of both \( x \) and \( y \), is equal to \( x + y \), the sum of the roots. But if considered as a function of \( x \) alone, it is \( \frac{1 + x^2}{x} \). And if considered as a function of \( y \) alone, it is \( \frac{1 + y^2}{y} \). \[ \therefore \frac{1 + x^2}{x} = \frac{1 + y^2}{y}, \] or \( x \) may be permuted for \( y \). (* The coefficient of the highest power of \( x \) being always supposed \( = 1 \).) Hence also \[ \varphi \cdot \frac{1 + x^2}{x} = \varphi \cdot \frac{1 + y^2}{y}, \] \( \varphi \) being any function. Quantities which (like \( \varphi \cdot \frac{1 + x^2}{x} \) in this example) are not changed in value by permuting the roots, may be termed "symmetrical functions" of the variables \( x, y, z, \&c. \) or simply "symmetricals" of the equation whose roots are \( x, y, z \ldots \). Thus the quantity \( \varphi \left( \frac{1 + x^2}{x} \right) \) is a symmetrical of the equation \[ x^2 - vx + 1 = 0.
\] But the quantity \( \varphi \left( \frac{1 - x^2}{x} \right) \) is not a symmetrical of it, because \( \frac{1 - x^2}{x} \) is not equal to \( \frac{1 - y^2}{y} \). **Ex. 2.** Let \( x, y, z \), be roots of \( x^3 - vx + 1 = 0 \), which may also be written \( y^3 - vy + 1 = 0 \), or \( z^3 - vz + 1 = 0 \), whence \[ v = \frac{1 + x^3}{x} = \frac{1 + y^3}{y} = \frac{1 + z^3}{z}, \] whence also \[ \varphi \cdot \frac{1 + x^3}{x} = \varphi \cdot \frac{1 + y^3}{y} = \varphi \cdot \frac{1 + z^3}{z}. \] Therefore the quantity \( \varphi \cdot \frac{1 + x^3}{x} \) is a symmetrical of the equation \( x^3 - vx + 1 = 0 \); but \( \varphi \cdot \frac{1 + x^2}{x} \) is not a symmetrical of it, because \( \frac{1 + x^2}{x} \) is not equal to \( \frac{1 + y^2}{y} \). These things being premised, it is evident that the same quantity may be a symmetrical of one equation and not so of another. Therefore the problem arises: Any quantity being given, to find the equation with respect to which it is symmetrical? **Ex. 1.** Let \( \frac{1 + x^2}{x} \) be the given quantity. Put \( \frac{1 + x^2}{x} = v \), \( v \) being a general symbol for any symmetrical quantity: \[ \therefore x^2 - vx + 1 = 0 \] is the required equation, and the indeterminate \( v \) is thereby determined to be the sum of the roots. **Ex. 2.** Let \( \frac{1 + x^3}{x} \) be the given quantity. Put \( \frac{1 + x^3}{x} = v \): \[ \therefore x^3 - vx + 1 = 0 \] is the required equation, and \( -v \) is thereby determined to be the sum of the products of every two roots*. --- * **Another example.**—Let \( \frac{1 + x + x^2}{1 - x} \) be the given quantity. Put it \( = v \): \[ \therefore x^2 + (1 + v)x + (1 - v) = 0 \] is the required equation, and \( -(1 + v) \) is determined to be the sum of the roots \( x + y \), and \( 1 - v \) their product \( xy \). Whence, by eliminating \( v \), we find the following relation between \( x \) and \( y \): \( xy - (x + y) = 2 \). This example will be referred to hereafter. Ex.
3. Let $x + \sqrt{1 - x^2}$ be the given quantity. Put $x + \sqrt{1 - x^2} = v$: $$\therefore (v - x)^2 = 1 - x^2$$ $$2x^2 - 2vx + (v^2 - 1) = 0$$ $$x^2 - vx + \frac{v^2 - 1}{2} = 0$$ is the required equation, and $v$ is determined to be the sum of its roots. Also $\frac{v^2 - 1}{2}$ is equal to their product $xy$. In other words, the equation must be such, that its roots $x, y$, answer the condition $xy = \frac{(x + y)^2 - 1}{2}$. Ex. 4. Let $x + \sqrt{\frac{1}{x}}$ be the given quantity. Put $x + \sqrt{\frac{1}{x}} = v$: $$\therefore (v - x)^2 = \frac{1}{x}$$ $$\therefore x^3 - 2vx^2 + v^2x - 1 = 0$$ is the required equation, which is thereby determined to be a cubic, the product of whose roots = 1, and $v$ is found to be half the sum of the roots. Case of exception.—It is essential to remark, that when the given quantity contains only one power of $x$, it cannot be a symmetrical. Ex. $\sqrt{1 + x^n}$ cannot be a symmetrical; for if it could, we should have $\sqrt{1 + x^n} = \sqrt{1 + y^n} = \sqrt{1 + z^n} = &c.$, whence $x = y = z = &c.$; whereas we suppose the roots to be in general all different from one another. With this exception the required equation may be easily found in most cases by putting the given quantity, or $fx = v$. And if the roots of the equation thus found are denoted by $x, y, z, . . . .$, it is an immediate consequence of the hypothesis that $fx = fy = fz = &c.$ Thus in the last example we have $$x + \sqrt{\frac{1}{x}} = y + \sqrt{\frac{1}{y}} = z + \sqrt{\frac{1}{z}}.$$ Let us now suppose that $S dx$, the sum of the differentials of the roots, or $dx + dy + dz &c.$, is multiplied by a symmetrical, that is, by one of the above-mentioned quantities $fx$. The product is $fx \cdot dx + fy \cdot dy + fz \cdot dz + &c.$ But in consequence of the equality $fx = fy = fz = &c.$ the result is the same, if the first term is multiplied by $fy$, the second by $fz$, the third by $fx$, and so on. 
So that the product is $fx \cdot dx + fy \cdot dy + fz \cdot dz + &c.$, which in our abbreviated notation is $Sfx \cdot dx$. $$\therefore fx \cdot S dx = Sfx \cdot dx. . . . . . . . . . . . . . . . (1.)$$ This theorem is of the greatest importance, and will be of constant use in the sequel. It must not be forgotten that it is only true when $fx$ is a symmetrical, and therefore capable of being represented by $v$. Replacing $fx$ by $v$, it becomes $v S dx = Sv dx$. In this form it is self-evident, because $v$ remains the same, however the letters $x, y, z, . . .$ are permuted. More generally, if the quantity $S \psi x \cdot dx$, which means $\psi x \cdot dx + \psi y \cdot dy + \psi z \cdot dz + \&c.$, is multiplied by $f \cdot x$, the product may be exhibited in the form $$f \cdot x \psi x \cdot dx + f \cdot y \psi y \cdot dy + f \cdot z \psi z \cdot dz + \&c.,$$ which in our notation is $S f \cdot x \psi x \cdot dx$. Whence the theorem $$f \cdot x S \psi x \cdot dx = S f \cdot x \psi x \cdot dx, . . . . . . . . . . . . . . . (2.)$$ which may also be put in the self-evident form $$v S \psi x \cdot dx = S v \psi x \cdot dx.$$ Equation (1.) is a corollary from (2.) when $\psi x = 1$. These results may be comprised in a general rule, viz. that whatever be the nature of the differential $\psi x \cdot dx$, if we multiply the sum of a series of such quantities, or $S \psi x \cdot dx$, by $f \cdot x$, any function of $x$, the multiplication is effected by introducing $f \cdot x$ within the sign $S$, provided (and this is the essential condition) that $f \cdot x$ is a symmetrical of that equation of which all the variables are roots. It is upon this principle that the method which I am about to explain chiefly reposes. Suppose $\int X \cdot dx$ to be the proposed integral, $X$ being any function of $x$. In the first place we have to determine the number of the other variables $y, z, \&c.$, and also the nature of the equation $x^n - p \cdot x^{n-1} + \&c. = 0$, of which they are roots.
And this may in general be accomplished by the following process: Assume $X$ to be a symmetrical of this unknown equation, or that $X = v$; then if this equation $X = v$ can be cleared of radicals, &c., (as in examples 1, 2, 3, 4,) it may be ultimately reduced to the form $$x^n - p \cdot x^{n-1} + p' \cdot x^{n-2} \ldots \ldots = 0,$$ where $p, p', p''$, the coefficients, are either constants or functions of $v$. The index $n$ of this equation determines the number of the variables. Let $Y$ be a quantity containing the variable $y$, in the same manner that $X$ contains $x$, and let $Z$ contain $z$ in the same manner, and so on for the other roots. Then $v = X = Y = Z = \&c.$, in consequence of the hypothesis that $X$ is a symmetrical of this equation. Therefore the sum of the following series of differentials, $$X \cdot dx + Y \cdot dy + Z \cdot dz + \&c.,$$ is equal to $X \cdot (dx + dy + dz + \&c.)$ $$= X \cdot S dx = v S dx.$$ Now $S dx = dp$, where $p$ is the coefficient of the second term of the equation $$x^n - p \cdot x^{n-1} + p' \cdot x^{n-2} \ldots \ldots = 0;$$ and, as we have before remarked, $p$ is either a constant or a function of $v = \varphi \cdot v$. First let it be a constant: then $$dp = S dx = 0,$$ which gives \[ X \, dx + Y \, dy + Z \, dz + \ldots = v \, S \, dx \] \[ = 0, \] whence we deduce the very important consequence \[ \int X \, dx + \int Y \, dy + \int Z \, dz + \ldots = \text{const.}, \] which is true, whatever be the nature of the function \( X \), provided only that the coefficient \( p \) is constant. Secondly, let \( p = \phi v \), \[ \therefore dp = d \cdot \phi v, \] \[ \therefore X \, dx + Y \, dy + Z \, dz + \&c. = v \cdot d \phi v, \] \[ \therefore \int X \, dx + \int Y \, dy + \int Z \, dz + \&c. 
= \int v \cdot d \phi v; \] and therefore the sum of the integrals, or \( S \int X \, dx \), is known, whenever the formula \( \int v \, d \phi v \) is capable of integration; or, which is equivalent, when the form \( \int \phi v \cdot dv \) is capable of integration. These consequences flow, with respect to the proposed integral \( \int X \, dx \), from the supposition that \( X \) is a symmetrical of the equation whose roots the variables are. But a much more general method is attainable, by putting the proposed integral under the form \( \int \frac{X}{\psi x} \cdot \psi x \, dx \), and then assuming the quantity \( \frac{X}{\psi x} \) to be a symmetrical of the said equation*. Therefore \( \frac{X}{\psi x} \) may be represented by the general symbol \( v \), and the proposed integral by \( \int v \cdot \psi x \, dx \). The series of differentials \[ X \, dx + Y \, dy + Z \, dz + \&c. \] may therefore be written \[ v \psi x \, dx + v \psi y \, dy + v \psi z \, dz + \&c. \] \[ = v (\psi x \, dx + \psi y \, dy + \&c.), \] which we write \[ = v \, S \psi x \, dx. \] Therefore, whenever it happens that \( S \psi x \, dx = 0 \), we have \( v \, S \psi x \, dx = 0 \), \[ \therefore \int X \, dx + \int Y \, dy + \int Z \, dz + \&c. = \text{const.}; \] and since $\psi x$ is arbitrary, this condition $S \psi x dx = 0$ may frequently be realized. (* Under this new supposition the quantity \( X \) of course ceases, in general, to be a symmetrical of the equation.) But it will be observed, that every change in the form of $\psi x$ changes the equation between the variables. But if $S \psi x dx$ is not $= 0$, as it generally is not, it must be a quantity symmetrically composed with respect to all the variables, and therefore a function of $v$, since all the coefficients of the equation are functions of $v$, or constants. Therefore it may be represented by $d.\phi v$, $$\therefore X dx + Y dy + Z dz + &c.
= v S \psi x dx$$ $$= v.d.\phi v$$ $$\therefore \int X dx + \int Y dy + &c. = \int v d.\phi v;$$ and therefore the sum of the integrals is known in all those cases in which $\int v d.\phi v$ can be integrated. The most direct and advantageous method of treating any proposed integral $\int X dx$, is to make one of the two suppositions above mentioned, viz. $X = v$, or $X = v.\psi x$. But the supposition $X = v + \psi x$, also, often leads to simple and satisfactory results. Our choice, however, is not limited to these forms, but may include others that are comprehended under the general formula $X = f(v,x)$, each of which may perhaps find its application in special cases. This process, in all its generality, constitutes the method which I now propose. The use and application of it will be best shown by examples. § 5. Direct integration of the formula $\int X . dx$, when that is possible, confirms and illustrates the above results, of which it will be convenient to adduce a few simple examples. Let there be two variables $x, y$, roots of $x^2 - vx + 1 = 0$. Therefore the quantity $\frac{1 + x^2}{x} = \frac{1 + y^2}{y} = v$ is a symmetrical of this equation. And we find $$x + y = v \quad \therefore dx + dy = dv \quad \ldots \ldots \ldots \ldots \ldots (1.)$$ $$xy = 1 \quad \therefore \frac{dx}{x} + \frac{dy}{y} = 0 \quad \ldots \ldots \ldots \ldots \ldots (2.)$$ $$x^2 + y^2 = v^2 - 2 \quad \therefore x dx + y dy = v dv; \quad \ldots \ldots \ldots (3.)$$ which results may be thus written: $$S dx = dv \quad \ldots \ldots \ldots \ldots \ldots \ldots \ldots \ldots \ldots (1.)$$ $$S \frac{dx}{x} = 0 \quad \ldots \ldots \ldots \ldots \ldots \ldots \ldots \ldots \ldots (2.)$$ $$S x dx = v dv. \quad \ldots \ldots \ldots \ldots \ldots \ldots \ldots \ldots \ldots (3.)$$ Multiply each of these equations by the equation \( \frac{1 + x^2}{x} = v \), and we find \[ \frac{1 + x^2}{x} S \, dx = v \, dv \quad \ldots \ldots \ldots \ldots \ldots \ldots \ldots (1.) 
\] \[ \frac{1 + x^2}{x} S \frac{dx}{x} = 0 \quad \ldots \ldots \ldots \ldots \ldots \ldots \ldots (2.) \] \[ \frac{1 + x^2}{x} S \, x \, dx = v^2 \, dv \quad \ldots \ldots \ldots \ldots \ldots \ldots \ldots (3.) \] But because \( \frac{1 + x^2}{x} \) is a symmetrical, we may, according to the general principle, introduce it within the sign \( S \). \[ \therefore S \frac{1 + x^2}{x} \, dx = v \, dv \quad \ldots \ldots \ldots \ldots \ldots \ldots \ldots (1.) \] \[ S \frac{1 + x^2}{x^2} \, dx = 0 \quad \ldots \ldots \ldots \ldots \ldots \ldots \ldots (2.) \] \[ S (1 + x^2) \, dx = v^2 \, dv \quad \ldots \ldots \ldots \ldots \ldots \ldots \ldots (3.) \] The integrals of these equations are \[ *S \int \frac{1 + x^2}{x} \, dx = \frac{v^2}{2} + \text{const.} \quad \ldots \ldots \ldots \ldots \ldots \ldots \ldots (1.) \] \[ S \int \frac{1 + x^2}{x^2} \, dx = \text{const.} \quad \ldots \ldots \ldots \ldots \ldots \ldots \ldots (2.) \] \[ S \int (1 + x^2) \, dx = \frac{v^3}{3} + \text{const.} \quad \ldots \ldots \ldots \ldots \ldots \ldots \ldots (3.) \] And we propose to verify these three results by direct integration. First then we have \[ \int \frac{1 + x^2}{x} \, dx + \int \frac{1 + y^2}{y} \, dy = \frac{x^2 + y^2}{2} + \log x + \log y + \text{const.} \] But \[ xy = 1 \quad \therefore \log x + \log y = 0, \] and \[ x^2 + y^2 = v^2 - 2. \] \[ \therefore \int \frac{1 + x^2}{x} \, dx + \int \frac{1 + y^2}{y} \, dy = \frac{v^2}{2} + \text{const.} \quad \ldots \ldots \ldots \ldots \ldots \ldots \ldots (1.) \] Secondly we have * It is indifferent whether we write \( S \int \) or \( \int S \), and we may remark that the signs \( S \int d \) may often be permuted. Thus if there are two variables, \[ S \int dx = \int S \, dx = x + y \] \[ dS \, x = S \, dx = dx + dy \] \[ \frac{1}{2} dS \, x^2 = S \, x \, dx = x \, dx + y \, dy.
\] \[ \int \frac{1 + x^2}{x^2} \, dx + \int \frac{1 + y^2}{y^2} \, dy = (x + y) - \left( \frac{1}{x} + \frac{1}{y} \right) + \text{const.} \] \[ = (x + y) - \frac{x + y}{xy} + \text{const.} \] \[ = \text{const.} \] Thirdly we have \[ \int (1 + x^2) \, dx + \int (1 + y^2) \, dy = (x + y) + \frac{x^3 + y^3}{3} + \text{const.} \] \[ = v + \frac{x^3 + y^3}{3} + \text{const.} \] But the formula makes it \[ = \frac{v^3}{3} + \text{const.} \] It is necessary therefore to show that these two results are in accordance, or that \[ \frac{v^3}{3} + \text{const.} = v + \frac{x^3 + y^3}{3}, \] or that \[ x^3 + y^3 = v^3 - 3v + \text{const.} \] This may be shown by multiplying together the equations \[ x^2 + y^2 = v^2 - 2, \] \[ x + y = v, \] which gives \[ x^3 + y^3 + xy(x + y) = v^3 - 2v, \] and since \[ xy = 1, \text{ and } x + y = v \] \[ x^3 + y^3 = v^3 - 3v \] in accordance with the formula. I will now apply the method to another example, which conducts to a new and interesting property of the cubic parabola. Since \( \phi \frac{1 + x^2}{x} \) is a symmetrical of the equation \( x^2 - vx + 1 = 0 \), (as has been already remarked) we have, as a particular instance of this, \[ \sqrt{\frac{1 + x^2}{x}} = \text{a symmetrical} = \sqrt{v}. \] Multiply this equation by \( S \, dx = dv \), \[ \therefore \sqrt{\frac{1 + x^2}{x}} \, S \, dx = \sqrt{v} \, dv, \] or \[ S \sqrt{\frac{1 + x^2}{x}} \cdot dx = \sqrt{v} \, dv. \] The sign \( S \) being thus transposed because \( \sqrt{\frac{1 + x^2}{x}} \) is a symmetrical. The integral of this last equation is \[ S \int \sqrt{\frac{1 + x^2}{x}} \, dx = \frac{2}{3} v^{\frac{3}{2}} + C, \] which means that \[ [1.] \int \sqrt{\frac{1 + x^2}{x}} \, dx + \int \sqrt{\frac{1 + y^2}{y}} \, dy = \frac{2}{3} (x + y)^{\frac{3}{2}} + C, \] provided that \( xy = 1 \). For since \( x, y \), are roots of \( x^2 - vx + 1 = 0 \), \( x + y = v \), a variable quantity, and \( xy = 1 \) is the only condition which the variables must satisfy.
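Equation [1.] can likewise be confirmed by quadrature. In the following sketch (modern notation, Python; the base point \( x = 1 \) of the antiderivative is an arbitrary choice of the illustration), the combination \( F(x) + F(1/x) - \frac{2}{3}(x + 1/x)^{3/2} \) is seen to be independent of \( x \).

```python
import math

def simpson(f, lo, hi, n=2000):      # composite Simpson rule, n even
    h = (hi - lo) / n
    return (h / 3.0) * (f(lo) + f(hi)
        + 4.0 * sum(f(lo + (2 * k - 1) * h) for k in range(1, n // 2 + 1))
        + 2.0 * sum(f(lo + 2 * k * h) for k in range(1, n // 2)))

def F(x):  # antiderivative (from 1) of sqrt((1+t^2)/t), x > 0
    return simpson(lambda t: math.sqrt((1.0 + t * t) / t), 1.0, x)

def invariant(x):  # should not depend on x, the pair being (x, 1/x)
    return F(x) + F(1.0 / x) - (2.0 / 3.0) * (x + 1.0 / x) ** 1.5

c = invariant(1.0)                   # equals -(2/3) * 2^(3/2)
for x in (0.5, 2.0, 3.0):
    assert abs(invariant(x) - c) < 1e-7
```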
Now assume \( x = u^2, y = t^2 \), and the relation between the new variables will be \( u^2 t^2 = 1 \), or \( ut = 1 \); and equation (1.) becomes, when divided by 2, \[ \frac{1}{3} (u^2 + t^2)^{\frac{3}{2}} + C = \int \sqrt{1 + u^4} \, du + \int \sqrt{1 + t^4} \, dt, \] whence the following theorem. *If \( u, t \), two ordinates of the cubic parabola*, are reciprocals, so that \( ut = 1 \), then the sum of the two corresponding arcs of the curve \( = \frac{1}{3} (u^2 + t^2)^{\frac{3}{2}} + \text{const}. \) (* The equation to the cubic parabola, whose coordinates are \( u, u' \), being \( 3u' = u^3 \), it follows that the arc \( = \int du \sqrt{1 + u^4} \).) The reader may wish to see this result also verified by direct integration. Since then \( ut = 1 \), let us write \( \frac{1}{u} \) instead of \( t \) in the equation \[ \frac{1}{3} (u^2 + t^2)^{\frac{3}{2}} + C = \int \sqrt{1 + u^4} \, du + \int \sqrt{1 + t^4} \, dt, \] and it becomes \[ \frac{1}{3} \left( u^2 + \frac{1}{u^2} \right)^{\frac{3}{2}} + C = \int \sqrt{1 + u^4} \, du - \int \sqrt{1 + u^4} \cdot \frac{du}{u^4}, \] or \[ \frac{1}{3} \cdot \frac{(1 + u^4)^{\frac{3}{2}}}{u^3} + C = \int \sqrt{1 + u^4} \, du \left( 1 - \frac{1}{u^4} \right), \] which ought to be identically true, whatever be the value of \( u \). To see that it is so in reality, we have only to differentiate the first part of the equation, and we find its differential to be \[ 2 \sqrt{1 + u^4} \, du - \frac{(1 + u^4)^{\frac{3}{2}}}{u^4} \cdot du \] \[ = \sqrt{1 + u^4} \, du \left( 2 - \frac{1 + u^4}{u^4} \right) = \sqrt{1 + u^4} \, du \left( 1 - \frac{1}{u^4} \right), \] which is the differential of the second part of the equation. Let us now show the application of the method to formulæ containing cubic radicals. Resuming the former equations \[ x^2 - v x + 1 = 0, \quad v = \frac{1 + x^2}{x}, \] we have \[ \sqrt[3]{v} = \sqrt[3]{\frac{1 + x^2}{x}}.
\] Multiply this equation by \( S \, dx (= dv) \) \[ \therefore \sqrt[3]{v} \cdot dv = \sqrt[3]{\frac{1 + x^2}{x}} \cdot S \, dx. \] But we may introduce \( \sqrt[3]{\frac{1 + x^2}{x}} \) (since it is a symmetrical) within the sign \( S \), \[ \therefore \sqrt[3]{v} \cdot dv = S \sqrt[3]{\frac{1 + x^2}{x}} \cdot dx, \] and integrating \[ \frac{3}{4} v^{\frac{4}{3}} + \text{const.} = S \int \sqrt[3]{\frac{1 + x^2}{x}} \cdot dx. \] It is plain that the sum of two integrals of the form \( \int \sqrt[3]{\frac{1 + x^2}{x}} \cdot dx \) may be found by a similar process, provided always that \( xy = 1 \). Resuming the last example we have \[ \sqrt[3]{v} = \sqrt[3]{\frac{1 + x^2}{x}}. \] If we multiply this equation by \( S \frac{dx}{x} \) instead of \( S \, dx \), we have \[ \sqrt[3]{v} \cdot S \frac{dx}{x} = \sqrt[3]{\frac{1 + x^2}{x}} S \frac{dx}{x} = S \sqrt[3]{\frac{1 + x^2}{x}} \cdot \frac{dx}{x} = S \sqrt[3]{\frac{1 + x^2}{x^4}} \cdot dx. \] But \( S \frac{dx}{x} = 0 \) in this example*, \[ \therefore 0 = S \sqrt[3]{\frac{1 + x^2}{x^4}} \cdot dx \] \[ \therefore \text{integrating we find the sum of two integrals of the form } \int \sqrt[3]{\frac{1 + x^2}{x^4}} \cdot dx \text{ is a constant, if } xy = 1. \] Since nothing tends more to elucidate a subject than a frequent recurrence to first principles, I will remark that this result also follows at once from the supposition \( xy = 1 \). For if we write \( \frac{1}{x} \) for \( y \), \[ \int \sqrt[3]{\frac{1 + x^2}{x^4}} \cdot dx + \int \sqrt[3]{\frac{1 + y^2}{y^4}} \cdot dy \] becomes \[ \int \sqrt[3]{\frac{1 + x^2}{x^4}} \cdot dx - \int \sqrt[3]{x^4 + x^2} \cdot \frac{dx}{x^2} \] \[ = \int \sqrt[3]{\frac{1 + x^2}{x^4}} \cdot dx - \int \sqrt[3]{\frac{1 + x^2}{x^4}} \cdot dx = 0 = \text{const.} \] * See page 200; or page 208, note. 
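Both cube-radical results admit of the same kind of numerical confirmation. In the sketch below (modern notation, Python; antiderivatives are taken from the arbitrary base point \( 1 \), a choice of the illustration), the two combinations are seen to be constant when \( y = 1/x \).

```python
import math

def simpson(f, lo, hi, n=2000):      # composite Simpson rule, n even
    h = (hi - lo) / n
    return (h / 3.0) * (f(lo) + f(hi)
        + 4.0 * sum(f(lo + (2 * k - 1) * h) for k in range(1, n // 2 + 1))
        + 2.0 * sum(f(lo + 2 * k * h) for k in range(1, n // 2)))

def K(x):  # antiderivative (from 1) of ((1+t^2)/t)^(1/3)
    return simpson(lambda t: ((1.0 + t * t) / t) ** (1.0 / 3.0), 1.0, x)

def H(x):  # antiderivative (from 1) of ((1+t^2)/t^4)^(1/3)
    return simpson(lambda t: ((1.0 + t * t) / t**4) ** (1.0 / 3.0), 1.0, x)

# First result: K(x) + K(1/x) - (3/4) v^(4/3), with v = x + 1/x, is constant.
c = -0.75 * 2.0 ** (4.0 / 3.0)       # its value at x = 1
for x in (0.5, 2.0, 3.0):
    v = x + 1.0 / x
    assert abs(K(x) + K(1.0 / x) - 0.75 * v ** (4.0 / 3.0) - c) < 1e-7

# Second result: the sum of the two integrals of ((1+t^2)/t^4)^(1/3) is
# constant (here 0, both antiderivatives being taken from 1).
for x in (0.5, 2.0, 3.0):
    assert abs(H(x) + H(1.0 / x)) < 1e-7
```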
We are now in possession of principles which enable us to attack the general problem, "To find an algebraic relation between \( n \) variables \( x, y, z \ldots \), such that \[ \int \phi (X) \, dx + \int \phi (Y) \, dy + \int \phi (Z) \, dz + \&c. = \text{const.}, \] \( X \) being a polynomial of \( n \) dimensions with constant coefficients of the form \[ x^n - ax^{n-1} + bx^{n-2} + \&c. + hx + k, \] and containing at least two distinct powers of \( x \); and \( \phi \) being any function whatever of the said polynomial." It does not appear that any mathematician has hitherto proposed this problem. The principles of our method lead to the following solution: Let \( X \), or \( x^n - ax^{n-1} + \ldots + hx + k = v \), \( v \) being a variable quantity susceptible of any value*. \[ \therefore x^n - ax^{n-1} + \ldots + hx + (k - v) = 0. \] This equation has only one variable coefficient, viz. \((k - v)\). Therefore the values of its \( n \) roots depend upon \( v \), so far, at least, that when \( v \) changes its value, each root (generally speaking) undergoes a corresponding change. Also the sum of the roots \( x + y + z + \ldots = a \) is constant. \[ \therefore dx + dy + dz + \ldots = 0, \] or \[ S dx = 0. \] Since \( v = X \), \( \phi v = \phi X \). Multiply this equation by \( S dx = 0 \), \[ \therefore \phi v S dx = \phi X \cdot S dx \] \[ \therefore 0 = \phi X \cdot S dx = S \phi X \cdot dx \] (because \( \phi X \) is a symmetrical of this equation) \[ \therefore \text{const.} = S \int \phi X \cdot dx, \] which therefore is the required solution of the problem. **Example.**—Let \( \int \sqrt[3]{x^3 + x + 1} \cdot dx \) be the proposed integral. Assume \( x^3 + x + 1 = v \), \[ \therefore x^3 + x + (1 - v) = 0. \] * To suppose \( X = v \) is the same as to suppose \( X \) to be a symmetrical of the equation between the variables (as recommended at pages 198, 200). Whence also \( \phi X \) is a symmetrical of the same equation.
The symbol \( v \) retains the same meaning as before, viz. that of a quantity independent of \( x \), or which continues to have the same value when \( x \) is permuted for any other root of the equation. I shall give it this meaning throughout the present memoir. The solution given in the text may be expressed in other words, by saying that any two of the variables, as for instance \( x \) and \( y \), are mutually connected by the equation \[ x^n - ax^{n-1} + \ldots + hx = y^n - ay^{n-1} + \ldots + hy, \] whence of course it follows that \( X \), or \( x^n - ax^{n-1} + \ldots + hx + k \), does not change its value when \( x \) is permuted for \( y \), and therefore it may properly be denoted by \( v \), according to the acceptation which we have hitherto given to that letter. Attribute to \( v \) any numerical value, and let the three roots of the equation then be \( m, m', m'' \). And when \( v \) has some other value, let the roots be \( n, n', n'' \). So that while \( v \) has changed progressively from one value to the other, the root \( m \) has progressively changed its value to \( n \), the root \( m' \) to \( n' \), and the root \( m'' \) to \( n'' \). These things being thus understood, the meaning of the theorem is, that the value of the integral \( \int d x \sqrt[3]{x^3 + x + 1} \) taken between the limits \( x = m, x = n \), \[ + \text{its value between the limits } m', n', + \text{its value between the limits } m'', n'', = \text{a constant}. \] If the question be viewed geometrically, since the roots of an equation are the intersections of a curve with its axis, a progressive change in the value of \((k - v)\), the absolute term, is equivalent to a displacement of the axis parallel to itself, in consequence of which all the intersections change their places simultaneously. In the case of two variables, we have simply \[ X = x^2 - ax + b = v, \] or \[ x^2 - ax + (b - v) = 0. 
\] And if \( x, y \) are the roots of this equation, the theorem becomes \[ \int \phi X \cdot dx + \int \phi Y \cdot dy = \text{const.}, \] \( \phi \) being any function. Now in this particular case the theorem admits of a very simple demonstration. For since \( x + y = a, y = a - x \); and substituting this value in \( Y = y^2 - ay + b \), it becomes \((a - x)^2 - a(a - x) + b = x^2 - ax + b \): also \( dy \) becomes \(-dx\). \[ \therefore \phi(y^2 - ay + b) \cdot dy \text{ becomes } -\phi(x^2 - ax + b) \cdot dx. \] Therefore \[ \phi(x^2 - ax + b) \cdot dx + \phi(y^2 - ay + b) \cdot dy = 0, \] or \[ \phi X \cdot dx + \phi Y \cdot dy = 0 \] \[ \therefore \int \phi X \cdot dx + \int \phi Y \cdot dy = \text{const.}, \] which was to be demonstrated. Let \( X = x^n - ax^{n-1} + \ldots \) as before; it may be shown upon the same principles that \( S \int \phi X \cdot x^m \cdot dx = \text{const.} \), provided \( S x^m \cdot dx = 0 \), or \( S x^{m+1} \) is constant, that is to say, does not contain \( v \); which depends on the relative values of \( m \) and \( n \). Also we may obtain in a similar manner the solution of the following problem, viz. \[ S \int \phi \left( \frac{X}{X'} \right) \cdot dx = \text{const.}, \] where \( X \) is a polynomial of \( n \) dimensions, and \( X' \) another polynomial of not more than \( n - 2 \) dimensions. For, putting \[ \frac{X}{X'} = v, \] \( x, y, \&c. \) are roots of \[ X - v X' = 0, \] which is of the form \[ x^n - a x^{n-1} + (b - v) x^{n-2} + \&c. = 0, \] where \[ S x = a = \text{const.}, \] and therefore \[ S d x = 0 \quad \therefore S \varphi \left( \frac{X}{X'} \right) d x = \varphi \left( \frac{X}{X'} \right) S d x = 0, \] \[ \therefore S \int \varphi \left( \frac{X}{X'} \right) d x = \text{const}. \] I will now add several examples, and I request the reader's attention to the directness with which their solutions are obtained by means of the foregoing principles.
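The two-variable theorem just demonstrated lends itself to a direct numerical trial. The following sketch (a modern illustration, with \( a = 1, b = 0 \) and the arbitrary function \( \phi(t) = 1/(1 + t^2) \) chosen purely for the experiment) integrates \( \phi(X) \) along the paths traced by the two roots as \( v \) passes from 1 to 2, and checks that the two integrals cancel:

```python
# Numerical sketch of the two-variable theorem: X = x^2 - a x + b with
# a = 1, b = 0.  As v runs from v0 to v1, the two roots of
# x^2 - x - v = 0 trace two paths; the theorem asserts that the
# integrals of phi(X) dx along those two paths sum to zero.
import math

def phi(t):
    # an arbitrary illustrative function of the polynomial X
    return 1.0 / (1.0 + t * t)

def integral(f, a, b, n=20000):
    # composite trapezoid rule (sign follows the orientation a -> b)
    h = (b - a) / n
    s = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        s += f(a + i * h)
    return s * h

def roots(v):  # the two roots of x^2 - x - v = 0
    d = math.sqrt(1.0 + 4.0 * v)
    return (1.0 + d) / 2.0, (1.0 - d) / 2.0

x0, y0 = roots(1.0)   # roots at v = 1
x1, y1 = roots(2.0)   # roots at v = 2
f = lambda x: phi(x * x - x)   # phi(X) with X = x^2 - x
total = integral(f, x0, x1) + integral(f, y0, y1)
assert abs(total) < 1e-6   # the two integrals cancel
```

The cancellation is exact here because \( y = a - x \) carries one path onto the other, precisely as in the demonstration above.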
In the present paper I have avoided the use of transformations, except that of \( x = u^n \), because they are unnecessary to the success of the method, and because I am here considering general principles rather than individual results. § 6. Examples. Ex. 1. Let the proposed integral be \[ \int \frac{d x}{\sqrt{1 - x^3}}. \] This is Mr. Lubbock's first example in his paper on Abel's theorem in the Philosophical Magazine*. The result which he finds is equivalent to this, that if \( x \) and \( y \) satisfy the equation \[ xy - (x + y) = 2, \] then \[ \int \frac{d x}{\sqrt{1 - x^3}} + \int \frac{d y}{\sqrt{1 - y^3}} = \text{const}. \] For the sake of comparison I will take this as the first example of my method, and supposing its solution to be unknown, proceed to investigate it as follows: \[ \int \frac{d x}{\sqrt{1 - x^3}} \] may be put under the form \[ \int \frac{d x}{(1 - x) \sqrt{\frac{1 + x + x^2}{1 - x}}}. \] Put \[ \frac{1 + x + x^2}{1 - x} = v \] \[ \therefore x^2 + (1 + v) x + (1 - v) = 0, \] \( x \) and \( y \) must be the roots of this equation†, * Vol. vi. p. 118. † See the note in page 196. \[ \begin{align*} \therefore x + y &= -(1 + v) \quad xy = 1 - v \\ \therefore xy - (x + y) &= 2, \end{align*} \] which is the equation of condition found by Mr. Lubbock. Again, the sum of the integrals \[ \int \frac{1}{\sqrt{v}} \cdot \frac{dx}{1-x} + \int \frac{1}{\sqrt{v}} \cdot \frac{dy}{1-y} \] \[ = \int \frac{1}{\sqrt{v}} \left( \frac{dx}{1-x} + \frac{dy}{1-y} \right) = \int 0 = \text{const.} \] because, since \[ xy - (x + y) = 2 \] \[ (1-x)(1-y) = 1 + xy - (x+y) = 3 \] \[ \therefore \log (1-x) + \log (1-y) = \log 3 \] \[ \therefore \frac{dx}{1-x} + \frac{dy}{1-y} = 0 \] \[ \therefore \text{sum of the integrals} = \text{const.} \quad Q. E. D. \] **Ex. 2.** To find the sum of three integrals of the same form. \[ \int \frac{dx}{\sqrt{1-x^3}} \text{ may be written } \int \frac{dx}{x} \sqrt{\frac{x^2}{1-x^3}}.
\quad \text{Put } \frac{x^2}{1-x^3} = \frac{1}{v}, \] \[ \therefore x^3 + vx^2 - 1 = 0. \] The three variables \(x, y, z\) must be roots of this equation, \[ \therefore xyz = 1, \text{ and } xy + xz + yz = 0. \] Here we have \[ \sqrt{\frac{x^2}{1-x^3}} = \sqrt{\frac{1}{v}}. \] Multiply this by \(S \frac{dx}{x} = 0*\), \[ \therefore \sqrt{\frac{x^2}{1-x^3}} S \frac{dx}{x} = 0. \] But \[ \sqrt{\frac{x^2}{1-x^3}} S \frac{dx}{x} = S \sqrt{\frac{x^2}{1-x^3}} \cdot \frac{dx}{x} = S \frac{dx}{\sqrt{1-x^3}} \] \[ \therefore S \frac{dx}{\sqrt{1-x^3}} = 0 \quad \therefore S \int \frac{dx}{\sqrt{1-x^3}} = \text{const.} \] * In any equation whose last term is constant, \(S \frac{dx}{x} = 0\). For \[ S \frac{dx}{x} = \frac{dx}{x} + \frac{dy}{y} + \frac{dz}{z} + \&c. \] \[ = d \log x + d \log y + \&c. \] \[ = d \log (xyz \ldots) \] \[ = d \cdot \text{const.} = 0. \] ∴ the sum of the three integrals is constant, provided that \(xy + xz + yz = 0\), and that \(xyz = 1\). It will be observed that the solution is simpler in the case of three integrals than of two. **Ex. 3.** Supposing the relation between \(x, y, z\) the same as in the last example, to find the sum of three integrals of the form \(\int \frac{x \, dx}{\sqrt{1-x^3}}\). Since \[ \sqrt{\frac{1}{v}} = \frac{x}{\sqrt{1-x^3}} \] or in other words, since \(\frac{x}{\sqrt{1-x^3}}\) is a symmetrical of the equation \[ x^3 + v x^2 - 1 = 0, \] if it be multiplied by \(S \, dx\) the result will be \[ S \frac{x \, dx}{\sqrt{1-x^3}}. \] Also \[ S x = -v \quad \therefore S \, dx = -dv \] \[ \therefore S \frac{x \, dx}{\sqrt{1-x^3}} = \sqrt{\frac{1}{v}} S \, dx = \frac{-dv}{\sqrt{v}}. \] Therefore \[ S \int \frac{x \, dx}{\sqrt{1-x^3}} = \text{const.} - 2\sqrt{v}. \] **Ex. 4.** The same suppositions continuing, required the sum of three integrals of the form \(\int \frac{x^2 \, dx}{\sqrt{1-x^3}}\). As before, \[ \frac{x}{\sqrt{1-x^3}} = \sqrt{\frac{1}{v}}.
\] Multiply by \(S \, x \, dx\), \[ \therefore S \frac{x^2 \, dx}{\sqrt{1-x^3}} = \sqrt{\frac{1}{v}} S \, x \, dx = dv \sqrt{v}, \] (because the equation \(x^3 + v x^2 - 1 = 0\) gives \(S x^2 = v^2 \quad \therefore S \, x \, dx = v \, dv\)) \[ \therefore S \int \frac{x^2 \, dx}{\sqrt{1-x^3}} = \frac{2}{3} v^{\frac{3}{2}} + \text{const.} \] But since \(\int \frac{x^2 \, dx}{\sqrt{1-x^3}}\) is a form which is readily integrable *per se*, it will naturally be asked whether the result of direct integration is the same as that given by our formulae. This example therefore affords a convenient opportunity of showing the close accordance between this branch of the integral calculus and the theory of algebraic equations. By direct integration, \[ S \int \frac{x^2 \, dx}{\sqrt{1-x^3}} = -\frac{2}{3} (\sqrt{1-x^3} + \sqrt{1-y^3} + \sqrt{1-z^3}) + \text{const.} \] And by our method, \[ S \int \frac{x^2 \, dx}{\sqrt{1 - x^3}} = \frac{2}{3} v^{\frac{3}{2}} + \text{const}. \] ∴ it remains to verify that \[ \sqrt{1 - x^3} + \sqrt{1 - y^3} + \sqrt{1 - z^3} = -v^{\frac{3}{2}}. \] In order to demonstrate this, let us resume the original equation \( x^3 + vx^2 - 1 = 0 \), which gives \( vx^2 = 1 - x^3 \), \[ ∴ \sqrt{v} \cdot x = \sqrt{1 - x^3}, \] and similarly, \[ \sqrt{v} \cdot y = \sqrt{1 - y^3} \] \[ \sqrt{v} \cdot z = \sqrt{1 - z^3} \] \[ ∴ \sqrt{1 - x^3} + \sqrt{1 - y^3} + \sqrt{1 - z^3} = \sqrt{v} (x + y + z) \] \[ = \sqrt{v} (-v) = -v^{\frac{3}{2}}. \quad Q.E.D. \] **Ex. 5.** \( \int \frac{dx}{\sqrt{x^3 + x^2} + \sqrt[3]{x^3 + x^2}}. \) This is a function of the binomial \( x^3 + x^2 \), which being put \( = v \), we have \[ x^3 + x^2 - v = 0. \] The three roots of this equation are the variables that answer the problem.
Putting \( \sqrt{v} + \sqrt[3]{v} = \phi v \), the sum of the three integrals becomes \[ \int \frac{dx}{\phi v} + \int \frac{dy}{\phi v} + \int \frac{dz}{\phi v} = \int \frac{1}{\phi v} S \, dx; \] but \( S \, dx = 0 \), because \( x + y + z = -1 \), \[ ∴ \text{sum of integrals} = \int 0 = \text{const}. \] **Ex. 6.** \( \int dx \sqrt{1 + x^n}. \) Here we cannot suppose \( \sqrt{1 + x^n} = v \) a symmetrical quantity, because that would amount to the supposition \( \sqrt{1 + x^n} = \sqrt{1 + y^n} = \sqrt{1 + z^n} = &c., \) which implies that \( x = y = z = &c., \) whereas we suppose the roots to be in general all different from one another. This is the reason why it was remarked before, that it was requisite the polynomial should contain at least two distinct powers of \( x \). When that is not the case, a second power of \( x \) must be introduced. There are several ways of doing this; the simplest is the following: Put \( \int dx \sqrt{1 + x^n} \) under the form \( \int x \, dx \sqrt{\frac{1 + x^n}{x^2}} \). Assume \( \frac{1 + x^n}{x^2} = v, \) \[ ∴ x^n - vx^2 + 1 = 0; \] an equation of \( n \) dimensions, of which \( v \) is the only variable coefficient. The \( n \) roots of this equation answer the problem. The sum of $n$ integrals becomes $$\int x \, dx \sqrt{v} + \int y \, dy \sqrt{v} + \text{&c.} = \int \sqrt{v} \cdot Sx \, dx.$$ If $n$ is a number greater than 4, $Sx^2 = 0$: $$\therefore Sx \, dx = 0 \quad \therefore \text{sum of } n \text{ integrals} = \text{const.}$$ If $n = 4$, $Sx^2 = 2v$, $\therefore Sx \, dx = dv$, $$\therefore \text{sum of four integrals} = \int \sqrt{v} \cdot dv = \frac{2}{3} v^{\frac{3}{2}} + C.$$ If $n = 3$, $Sx^2 = v^2$, $\therefore Sx \, dx = v \, dv$, $$\therefore \text{sum of three integrals} = \int v^{\frac{3}{2}} \, dv = \frac{2}{5} v^{\frac{5}{2}} + C.$$ But when $n$ is greater than 4, the equation has impossible roots, therefore the solution is imaginary. 
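The statements made in Ex. 6 about the roots of \( x^n - v x^2 + 1 = 0 \) admit a quick numerical check. The sketch below (a modern illustration; the value \( v = 3 \) is chosen arbitrarily) confirms that for \( n = 4 \) all four roots are real when \( v > 2 \) and \( S x^2 = 2v \); that for \( n = 3 \), \( S x = v \) and \( S x^2 = v^2 \); and that for \( n = 6 \) the equation indeed has imaginary ("impossible") roots:

```python
# Numerical sketch of the root relations used in Ex. 6, for the
# equation x^n - v x^2 + 1 = 0 with an illustrative value of v.
import numpy as np

v = 3.0

# n = 4: for v > 2 all four roots are real, and S x^2 = 2v.
r4 = np.roots([1.0, 0.0, -v, 0.0, 1.0])
assert np.all(np.abs(r4.imag) < 1e-9)
assert abs((r4.real ** 2).sum() - 2 * v) < 1e-9

# n = 3: S x = v and S x^2 = v^2, as stated in the text.
r3 = np.roots([1.0, -v, 0.0, 1.0])
assert abs(r3.sum().real - v) < 1e-9
assert abs((r3 ** 2).sum().real - v * v) < 1e-9

# n = 6 (greater than 4): imaginary roots are always present.
r6 = np.roots([1.0, 0.0, 0.0, 0.0, -v, 0.0, 1.0])
assert np.any(np.abs(r6.imag) > 1e-6)
```

For \( n = 6 \) the substitution \( y = x^2 \) shows why: one root of the cubic in \( y \) is negative, so a conjugate pair of imaginary values of \( x \) is unavoidable.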
As Legendre has demonstrated*, however, these imaginary cases do not cease to have a real analytical meaning; the sum of two imaginary integrals forming a real integral, in a manner analogous to that in which two imaginary roots of an equation form a real sum and product. But we may avoid these imaginary solutions by putting $\int dx \sqrt{1 + x^n}$ in the form $$\int dx (a + bx + cx^2 \ldots \ldots) \sqrt{\frac{1 + x^n}{(a + bx + cx^2 \ldots \ldots)^2}}.$$ Assuming, then, $$\frac{1 + x^n}{(a + bx + cx^2 \ldots \ldots)^2} = v,$$ we may attribute to the polynomial any number of terms suitable to the exponent $n$†, and then it is in most cases possible to find such numerical values for the constant coefficients $a$, $b$, $c$, &c., that the resulting equation shall have all its roots real. Each integral, then, has the form $$\int dx (a + bx + cx^2 \ldots \ldots) \sqrt{v},$$ and the sum of all $$= \int \sqrt{v} \cdot S \, dx \, (a + bx + cx^2 \ldots \ldots),$$ where $S \, dx \, (a + bx + cx^2 \ldots \ldots)$ = the aggregate of the partial sums $$a S \, dx + b S x \, dx + c S x^2 \, dx + \ldots \ldots,$$ which is the differential of $$aSx + \frac{b}{2} Sx^2 + \frac{c}{3} Sx^3 + \ldots \ldots,$$ and may therefore be expressed in terms of $v$, since the quantities $Sx$, $Sx^2$, $Sx^3$, &c. are readily found in terms of $v$ by the usual doctrine of algebraic equations. * Fonctions Elliptiques, vol. iii. p. 326. † In general the number of its terms may be $\frac{n}{2}$ or $\frac{n+1}{2}$. Ex. 7. \( \int dx \sqrt[3]{1 + x^n} \). This may be put in the form \[ \int dx (a + bx + \ldots) \sqrt[3]{\frac{1 + x^n}{(a + bx + \ldots)^3}}; \] and putting \[ \frac{1 + x^n}{(a + bx + \ldots)^3} = v, \] the reasoning is nearly the same as in the preceding case. The same principles are applicable to the more general integral \( \int dx \sqrt[m]{1 + x^n} \), \( m \) being a whole number. These solutions give the algebraic sum of \( n \) integrals of the proposed form.
But this number \( n \) may be reduced by various methods to a lower number, which is the minimum that the problem admits of: ex. gr. the lowest number of integrals of the form \( \int dx \sqrt{1 + x^4} \) which have an algebraic sum is two; of the form \( \int dx \sqrt{1 + x^5} \) is three; of the form \( \int dx \sqrt{1 + x^{10}} \) is likewise three, &c. &c., which subject I shall treat of in a subsequent section. Ex. 8. \( \int \frac{dx}{\sqrt[3]{x^3 - 1}} \). First solution. Put \( x^3 = t \), and the integral becomes \[ \frac{1}{3} \frac{t^{-\frac{2}{3}} dt}{\sqrt[3]{t - 1}} = \frac{1}{3} \cdot \frac{dt}{\sqrt[3]{t^3 - t^2}}. \] Put \( t^3 - t^2 = v \), or \( t^3 - t^2 - v = 0 \). The three roots of this equation answer the problem. \[ \therefore \text{the sum of three integrals } = \int \frac{1}{3} S \frac{dt}{\sqrt[3]{v}} = \int 0 = \text{const.} \] (because \( S t = 1 \), being the coefficient of the second term of the equation \( t^3 - t^2 - v = 0 \) taken negatively, whence \( S dt = 0 \)). Second solution. Put \( x^3 = t^2 \), and the integral becomes \[ \frac{2}{3} \frac{t^{-\frac{1}{3}} dt}{\sqrt[3]{t^2 - 1}} = \frac{2}{3} \cdot \frac{dt}{\sqrt[3]{t^3 - t}}. \] Put \( t^3 - t = v \), \[ \therefore t^3 - t - v = 0, \] and the roots of this equation answer the problem. \[ \therefore \text{the sum of three integrals } = \int \frac{2}{3} S \frac{dt}{\sqrt[3]{v}} = \int 0 = \text{const.} \] (because \( S t = 0 \) in the equation \( t^3 - t - v = 0 \) \( \therefore S dt = 0 \)), and the sum of the integrals therefore reduces itself in this case also to a constant. It will probably be satisfactory to the reader to see some one of these results verified by arithmetical computation. Let us therefore select this last example for that purpose. § 7. Example of an arithmetical calculation of the sum of three Integrals.
The preceding analysis shows that the sum of the three integrals \[ \int \frac{dx}{\sqrt[3]{x^3 - 1}} + \int \frac{dy}{\sqrt[3]{y^3 - 1}} + \int \frac{dz}{\sqrt[3]{z^3 - 1}} = \text{const}. \] if \( x^{\frac{3}{2}}, y^{\frac{3}{2}}, z^{\frac{3}{2}} \) are roots of the equation \[ t^3 - t - v = 0. \] But the form of this equation shows that the sum of its roots \(= 0\), the sum of the products of every two roots \(= -1\), while the product of all the roots is a variable quantity \(= v\); \(\therefore\) the quantities \(x, y, z\) must satisfy the two following equations, \[ x^{\frac{3}{2}} + y^{\frac{3}{2}} + z^{\frac{3}{2}} = 0 \] \[ (xy)^{\frac{3}{2}} + (xz)^{\frac{3}{2}} + (yz)^{\frac{3}{2}} = -1. \] And if they do so, we shall have \[ \int_x + \int_y + \int_z = \text{const}, \] denoting by \( \int_x \) the integral \( \int \frac{dx}{\sqrt[3]{x^3 - 1}} \). But in order to eliminate the constant, we may take three other variables \(x', y', z'\), satisfying the same two equations of condition, and thence deduce \[ \int_{x'} + \int_{y'} + \int_{z'} = \text{const}. \] Whence by subtraction we eliminate the constant \[ (\int_x - \int_{x'}) + (\int_y - \int_{y'}) + (\int_z - \int_{z'}) = 0. \quad \ldots \quad [1.] \] Now by the usual methods we find that the equations of condition are satisfied by the values \[ x = .352342 \] \[ y = .917532 \] \[ z = 1.057860, \] and also by the values \[ x' = .392456 \] \[ y' = .900227 \] \[ z' = 1.065602*. \] * These values give (taking, as the first equation of condition requires, the negative value of the square root for the two smaller quantities) \[ x^{\frac{3}{2}} = -0.209149 \] \[ y^{\frac{3}{2}} = -0.878885 \] \[ z^{\frac{3}{2}} = 1.088034 \] \[ x'^{\frac{3}{2}} = -0.245862 \] \[ y'^{\frac{3}{2}} = -0.854138 \] \[ z'^{\frac{3}{2}} = 1.100000 \] Sum \(= 0\) \quad Sum \(= 0\) This verifies the first equation of condition. The squares of these quantities are given in the continuation of this note on the next page. It remains therefore to try by actual calculation whether these values satisfy the equation [1.].
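Before repeating the computation of the integrals themselves, the two equations of condition may be tested against the values above. A modern numerical sketch (not part of the original), following the footnote in taking the negative square root for the two smaller quantities:

```python
# Check that Talbot's two triples satisfy the stated equations of
# condition.  Per his footnote, the branch t = -x^(3/2) is taken for
# the two smaller values and t = +z^(3/2) for the largest, so that
# the three t's are roots of t^3 - t - v = 0.
vals = [(0.352342, 0.917532, 1.057860),
        (0.392456, 0.900227, 1.065602)]
for x, y, z in vals:
    tx, ty, tz = -x**1.5, -y**1.5, z**1.5
    # first equation of condition: the sum of the roots is zero
    assert abs(tx + ty + tz) < 1e-4
    # second: the sum of the products of every two roots is -1
    assert abs(tx*ty + tx*tz + ty*tz + 1.0) < 1e-4
```

Both triples pass to within the six-decimal precision of Talbot's values.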
\[ \int_x = \int \frac{dx}{\sqrt[3]{x^3 - 1}} = x + \frac{1}{3} \cdot \frac{x^4}{4} + \frac{1 \cdot 4}{3 \cdot 6} \cdot \frac{x^7}{7} + \frac{1 \cdot 4 \cdot 7}{3 \cdot 6 \cdot 9} \cdot \frac{x^{10}}{10} + \&c. \] \[ \int_{x'} = \int \frac{dx'}{\sqrt[3]{x'^3 - 1}} = x' + \frac{1}{3} \cdot \frac{x'^4}{4} + \frac{1 \cdot 4}{3 \cdot 6} \cdot \frac{x'^7}{7} + \frac{1 \cdot 4 \cdot 7}{3 \cdot 6 \cdot 9} \cdot \frac{x'^{10}}{10} + \&c. \] ∴ putting \( x' - x = \Delta x, x'^4 - x^4 = \Delta (x^4) \) &c. \[ \int_{x'} - \int_x = \Delta x + \frac{1}{3} \cdot \frac{\Delta (x^4)}{4} + \frac{1 \cdot 4}{3 \cdot 6} \cdot \frac{\Delta (x^7)}{7} + \frac{1 \cdot 4 \cdot 7}{3 \cdot 6 \cdot 9} \cdot \frac{\Delta (x^{10})}{10} + \&c. \] and since \( \Delta x = .04011 \) is a small quantity, we readily find the sum of the series \( = .04083 \). Treating the other variables in the same manner, the result obtained is \[ X = \int_{x'} - \int_x = .040834 \] \[ Y = \int_{y'} - \int_y = .027526 \] \[ Z = \int_{z'} - \int_z = .013315. \] With regard to the signs, it appears that the integral \( X \) has a sign opposed to that of the other two. We find therefore finally, \[ Y + Z = .040841 \] \[ X = .040834 \] ∴ \( Y + Z - X = .000007. \) On the other hand the formula gives \( Y + Z - X = 0 \), rigorously. Therefore the computation is only in error in the sixth place of decimals, which in consequence of the prolixity of these calculations may be considered to be a sufficient trial of its accuracy. \[ x^3 = .043743 \quad x'^3 = .060448 \] \[ y^3 = .772439 \quad y'^3 = .729552 \] \[ z^3 = 1.183818 \quad z'^3 = 1.210000 \] Sum = 2 \quad Sum = 2 Squaring the equation \( x^{\frac{3}{2}} + y^{\frac{3}{2}} + z^{\frac{3}{2}} = 0 \), we have \[ (x^3 + y^3 + z^3) + 2 \left( (xy)^{\frac{3}{2}} + (xz)^{\frac{3}{2}} + (yz)^{\frac{3}{2}} \right) = 0, \] and substituting the value just found of \( x^3 + y^3 + z^3 = 2 \), we have \[ (xy)^{\frac{3}{2}} + (xz)^{\frac{3}{2}} + (yz)^{\frac{3}{2}} = -1, \] which verifies the second equation of condition. Note.—The integrals comprised in the formula $\int \frac{P dx}{\sqrt{R}}$ have been called ultra-elliptic by Legendre.
I think I have sufficiently shown that no line of distinction can be drawn between them and integrals in general; all of which, that are functions of a given polynomial, possess the property which was supposed to characterize the ultra-elliptic class.
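Talbot's arithmetical trial of § 7 can also be repeated with modern quadrature. The sketch below (not part of the original) evaluates the three integrals using the real cube root, which is negative for \( x < 1 \); on that convention the sign of \( X \) comes out opposed to the other two of itself, exactly as Talbot observes, and the three contributions cancel:

```python
# Modern repetition of the arithmetical trial in section 7: integrate
# 1/cbrt(x^3 - 1) between Talbot's paired limits and check that the
# three differences cancel.  A composite trapezoid rule suffices, as
# the integrand is smooth on each of the three intervals.
import math

def f(x):
    u = x**3 - 1.0
    # real cube root: negative for x < 1, positive for x > 1
    return 1.0 / math.copysign(abs(u) ** (1.0 / 3.0), u)

def integral(a, b, n=20000):
    h = (b - a) / n
    s = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        s += f(a + i * h)
    return s * h

X = integral(0.352342, 0.392456)   # about -0.040834
Y = integral(0.917532, 0.900227)   # about +0.027526
Z = integral(1.057860, 1.065602)   # about +0.013315
assert abs(X + 0.040834) < 1e-3    # magnitudes agree with Talbot's
assert abs(Y - 0.027526) < 1e-3
assert abs(Z - 0.013315) < 1e-3
assert abs(X + Y + Z) < 1e-4       # the theorem: the sum vanishes
```

The residual is of the order of the rounding in the six-decimal limits, matching the error "in the sixth place of decimals" that Talbot reports.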