Integration and differentiation method for power series summation


Description of the method

This is a general method used for summing up power series. The general idea is to consider a power series of the form:

a_0 + a_1 x + a_2 x^2 + a_3 x^3 + \dots = \sum_{n=0}^{\infty} a_n x^n

Instead of trying to sum it up directly, we try either of these:

  • Differentiate the power series, find the sum of that, and then integrate the function obtained, choosing the antiderivative whose value at 0 equals the constant term of the power series (a code sketch of this route follows the list); OR
  • Integrate the power series, find the sum of that, and then differentiate the function obtained.
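
For concreteness, the first route can be carried out with a computer algebra system. The following is a minimal sketch in Python using SymPy, applied to the case where the derivative power series sums to 1/(1 + x^2) (the arc tangent example treated in detail below); the variable names are ours, chosen for illustration.

    import sympy as sp

    x = sp.symbols('x')

    # Function that the derivative power series converges to: here the derivative
    # series 1 - x^2 + x^4 - ... sums to 1/(1 + x^2) for |x| < 1.
    derivative_sum = 1 / (1 + x**2)

    # Integrate, then pick the antiderivative whose value at 0 equals the constant
    # term of the original power series (0 in this case).
    F = sp.integrate(derivative_sum, x)
    F = F - F.subs(x, 0)
    print(F)                                # atan(x)

    # Sanity check: the Maclaurin series of atan(x) reproduces the original series.
    print(sp.series(sp.atan(x), x, 0, 8))   # x - x**3/3 + x**5/5 - x**7/7 + O(x**8)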


Examples

For these examples, we do not worry for the moment about the interval of validity, but simply try to compute the sums formally.

Summing up the derivative power series

Power series | Derivative power series | Function that the derivative power series converges to | Antiderivative of that function whose value at 0 is the constant term of the power series (= our answer)
\sum_{n=0}^{\infty} \frac{(-1)^n x^{2n+1}}{n!(2n+1)} | \sum_{n=0}^{\infty} \frac{(-1)^n x^{2n}}{n!} | e^{-x^2} | We would need to integrate e^{-x^2}. The antiderivative that is 0 at 0 is \int_0^x e^{-t^2} \, dt, also called (up to a constant factor) the erf of x (it is not among the elementary functions usually seen in calculus).
\sum_{n=1}^{\infty} \frac{x^n}{n} | \sum_{n=0}^{\infty} x^n (geometric series) | \frac{1}{1-x} | -\ln(1-x) = \ln\frac{1}{1-x}
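
The erf entry can be reproduced in SymPy. This is a small check, assuming that row refers to a power series whose derivative power series sums to e^{-x^2}; SymPy expresses the required antiderivative through the non-elementary function erf.

    import sympy as sp

    x = sp.symbols('x')

    # The derivative power series in that row sums to exp(-x^2); integrating gives
    # an answer that SymPy expresses through the non-elementary function erf.
    F = sp.integrate(sp.exp(-x**2), x)
    print(F)                       # sqrt(pi)*erf(x)/2  (already takes the value 0 at 0)

    # Its Maclaurin series is the term-by-term antiderivative of the series of exp(-x^2).
    print(sp.series(F, x, 0, 8))   # x - x**3/3 + x**5/10 - x**7/42 + O(x**8)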

Summing up the antiderivative power series

Power series | Antiderivative power series (we fix the constant term based on our convenience; it could be anything) | Function that the antiderivative power series converges to | Derivative of that function (= our answer)
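
Here is a minimal sketch of this second route in Python with SymPy. The input series 1 + 2x + 3x^2 + 4x^3 + ... is our own illustrative choice (it is not taken from the table above): its term-by-term antiderivative, with constant term chosen to be 0, is x + x^2 + x^3 + ..., a geometric series with first term x and ratio x, summing to x/(1 - x) for |x| < 1.

    import sympy as sp

    x = sp.symbols('x')

    # Sum of the antiderivative power series x + x^2 + x^3 + ... for |x| < 1.
    antiderivative_sum = x / (1 - x)

    # Differentiate that sum to recover the sum of the original series 1 + 2x + 3x^2 + ...
    answer = sp.simplify(sp.diff(antiderivative_sum, x))
    print(answer)                              # 1/(1 - x)**2 (possibly printed as (x - 1)**(-2))

    # Sanity check: the Maclaurin series of the answer is 1 + 2x + 3x^2 + 4x^3 + ...
    print(sp.series(1 / (1 - x)**2, x, 0, 5))  # 1 + 2*x + 3*x**2 + 4*x**3 + 5*x**4 + O(x**5)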

Interval of validity

We will assume that the summation formula that we have is valid on the entire interval of convergence of the power series.

The radius of convergence is not affected by differentiation or integration, i.e., the radius of convergence of a power series is the same as that of its derivative power series and antiderivative power series.

However, the endpoints can change as follows:

  • Going from a power series to its derivative power series, the interval of convergence cannot get bigger, but it may get smaller. In other words, we may lose endpoints, but we cannot gain endpoints (an illustration follows this list).
  • Going from a power series to its antiderivative power series, the interval of convergence may get bigger, but it cannot get smaller. In other words, we may gain endpoints, but we cannot lose endpoints.
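
As an illustration (the series here is our own and is not discussed elsewhere on this page): the power series x + x^2/4 + x^3/9 + ... (general term x^n/n^2) converges on [-1, 1], while its derivative power series (general term x^{n-1}/n) converges only on [-1, 1), so the endpoint x = 1 is lost upon differentiation. The behaviour at x = 1 can be confirmed in SymPy:

    import sympy as sp

    n = sp.symbols('n', integer=True, positive=True)

    # At the endpoint x = 1:
    print(sp.Sum(1/n**2, (n, 1, sp.oo)).is_convergent())  # True  -- the original series converges
    print(sp.Sum(1/n, (n, 1, sp.oo)).is_convergent())     # False -- the derivative series diverges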

How to reason about the endpoints

We reason as follows:

  • First, we use the alternating series theorem, either directly or indirectly through the various rules for determining the interval of convergence, to determine whether the endpoints are included in the interval of convergence of the power series we want to sum.
  • For those endpoints that are included, we try to use Abel's theorem to establish that, at the endpoint, the power series converges to the same function that it converges to in the interior of the interval of convergence.

Example of arctan

For further information, refer: arc tangent function#Power series

Consider the series:

x - \frac{x^3}{3} + \frac{x^5}{5} - \frac{x^7}{7} + \dots = \sum_{n=0}^{\infty} \frac{(-1)^n x^{2n+1}}{2n+1}

We differentiate this to get:

1 - x^2 + x^4 - x^6 + \dots = \sum_{n=0}^{\infty} (-1)^n x^{2n}

The derivative power series converges to 1/(1 + x^2) and the interval of convergence is (-1, 1) (this is on account of it being a geometric series with common ratio -x^2; we could also use the degree difference test to show that there is no convergence at the endpoints).

In other words:

\sum_{n=0}^{\infty} (-1)^n x^{2n} = \frac{1}{1 + x^2} \quad \text{for } -1 < x < 1

The sum of the original power series must thus be a function that takes the value 0 at 0 (because that's the constant term of the power series) and has derivative 1/(1 + x^2). The only such function is arctan x, so:

\sum_{n=0}^{\infty} \frac{(-1)^n x^{2n+1}}{2n+1} = \arctan x \quad \text{for } -1 < x < 1
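
A quick numerical spot check of this formula at a point inside the interval (plain Python; the point x = 0.5 and the number of terms are arbitrary choices):

    import math

    x = 0.5
    partial = sum((-1)**n * x**(2*n + 1) / (2*n + 1) for n in range(50))
    print(partial, math.atan(x))   # both approximately 0.4636476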

However, we now have to consider whether the endpoints get included. To check this, we plug the endpoint x = 1 into the power series. We get a series that converges by the alternating series theorem:

1 - \frac{1}{3} + \frac{1}{5} - \frac{1}{7} + \dots = \sum_{n=0}^{\infty} \frac{(-1)^n}{2n+1}

Similarly, plugging in x = -1 also gives a series that converges by the alternating series theorem:

-1 + \frac{1}{3} - \frac{1}{5} + \frac{1}{7} - \dots = \sum_{n=0}^{\infty} \frac{(-1)^{n+1}}{2n+1}

Thus, the power series converges at both endpoints. We also know that arctan is defined, not just on (-1, 1), but continuously on the whole real line. In particular, the definition on (-1, 1) extends continuously to the endpoints. Thus, by Abel's theorem on convergence of power series, the power series must converge to the function arctan at the endpoints, and we get:

1 - \frac{1}{3} + \frac{1}{5} - \frac{1}{7} + \dots = \arctan(1) = \frac{\pi}{4} \qquad \text{and} \qquad -1 + \frac{1}{3} - \frac{1}{5} + \frac{1}{7} - \dots = \arctan(-1) = -\frac{\pi}{4}

Thus, the overall conclusion is:

\sum_{n=0}^{\infty} \frac{(-1)^n x^{2n+1}}{2n+1} = \arctan x \quad \text{for } -1 \le x \le 1

and the series does not converge outside [-1, 1].

Additional note: The degree difference test tells us that, in fact, convergence at both endpoints is conditional. On the other hand, convergence on the interior is absolute.
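
A numerical check of the endpoint value at x = 1 (plain Python); the slow approach to π/4 reflects the fact that convergence there is only conditional:

    import math

    # Partial sums of 1 - 1/3 + 1/5 - 1/7 + ... (the power series at the endpoint x = 1).
    partial = sum((-1)**n / (2*n + 1) for n in range(100000))
    print(partial, math.pi / 4)    # both approximately 0.7854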

Example of natural logarithm

Consider the series:

1 - \frac{1}{2} + \frac{1}{3} - \frac{1}{4} + \dots = \sum_{n=1}^{\infty} \frac{(-1)^{n-1}}{n}

Put x^n in the numerator of the n^{th} term, making the sum a power series in x:

x - \frac{x^2}{2} + \frac{x^3}{3} - \frac{x^4}{4} + \dots = \sum_{n=1}^{\infty} \frac{(-1)^{n-1} x^n}{n}

Differentiate and get:

1 - x + x^2 - x^3 + \dots = \sum_{n=0}^{\infty} (-1)^n x^n

The sum is 1/(1 + x) and the interval of convergence is (-1, 1), so we have:

\sum_{n=0}^{\infty} (-1)^n x^n = \frac{1}{1 + x} \quad \text{for } -1 < x < 1

The sum of the original power series is the antiderivative of 1/(1 + x) whose value at 0 is 0, the constant term of the power series. This is ln(1 + x), and we get:

\sum_{n=1}^{\infty} \frac{(-1)^{n-1} x^n}{n} = \ln(1 + x) \quad \text{for } -1 < x < 1
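
The integration step can again be checked with SymPy, which returns log(x + 1), already the antiderivative taking the value 0 at 0:

    import sympy as sp

    x = sp.symbols('x')

    F = sp.integrate(1 / (1 + x), x)
    print(F, F.subs(x, 0))                     # log(x + 1) 0

    # Its Maclaurin series reproduces the original power series.
    print(sp.series(sp.log(1 + x), x, 0, 6))   # x - x**2/2 + x**3/3 - x**4/4 + x**5/5 + O(x**6)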

The definition on the right side extends continuously at x = 1, though not at x = -1. Further, the expression on the left side has interval of convergence (-1, 1], with the endpoint 1 included by the alternating series theorem. Thus, by Abel's theorem on convergence of power series, the series converges to the function at x = 1:

\sum_{n=1}^{\infty} \frac{(-1)^{n-1} x^n}{n} = \ln(1 + x) \quad \text{for } -1 < x \le 1

So, we get:

1 - \frac{1}{2} + \frac{1}{3} - \frac{1}{4} + \dots = \ln 2

Additional note: The degree difference test tells us that convergence at 1 is conditional, but convergence on (-1, 1) is absolute.
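
As with the arc tangent example, a numerical check of the endpoint value (plain Python; the number of terms is an arbitrary choice):

    import math

    # Partial sums of 1 - 1/2 + 1/3 - 1/4 + ... approach ln 2 slowly,
    # as expected for a series that converges only conditionally.
    partial = sum((-1)**(n + 1) / n for n in range(1, 100001))
    print(partial, math.log(2))    # both approximately 0.6931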