How do you find a Power Series solution of a linear differential equation?


To find a solution of a linear ordinary differential equation

a_n (x) y^((n)) + a_(n-1) (x) y^((n-1)) + cdots + a_1 (x) y^prime + a_0 (x) y = 0

around the point x_0, we must first classify the point x_0 to decide which power series method to use (a quick classification check is sketched after the list below).

  • If the point x_0 is an ordinary point for the differential equation, that is, the ratios a_i(x)/a_n(x) are analytic around x_0 (their Taylor series around x_0 have a nonzero radius of convergence), then we can use the ordinary power series method, described below.

  • If the point x_0 is a regular singular point for the differential equation, that is, (x-x_0)^(n-i) a_i(x)/a_n(x) are analytic around x_0 for each i = 0, ..., n-1, then we should use the Frobenius method (which will not be described in detail here, as it is more involved).

  • If the point x_0 is an irregular singular point, neither method applies in general, and the behavior of the solutions near x_0 can be far more complicated.
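When the coefficients a_i(x) are rational functions, this classification can be checked mechanically. Below is a minimal SymPy sketch of one such check based on finite limits (sufficient for rational coefficients, not a full analyticity test); Bessel's equation of order 1, x^2 y'' + x y^prime + (x^2-1) y = 0, is used purely as a hypothetical example.

```python
import sympy as sp

x = sp.symbols('x')
x0 = 0

# Hypothetical example: Bessel's equation of order 1,
#   x^2 y'' + x y' + (x^2 - 1) y = 0
a2, a1, a0 = x**2, x, x**2 - 1

# Normalise by the leading coefficient
p = sp.simplify(a1 / a2)   # coefficient of y'
q = sp.simplify(a0 / a2)   # coefficient of y

def has_finite_limit(expr, point):
    # Sufficient analyticity check for rational expressions:
    # the expression stays finite as x -> point.
    return sp.limit(expr, x, point).is_finite is True

if has_finite_limit(p, x0) and has_finite_limit(q, x0):
    print("x0 is an ordinary point")
elif has_finite_limit((x - x0) * p, x0) and has_finite_limit((x - x0)**2 * q, x0):
    print("x0 is a regular singular point")   # this branch fires for Bessel at x0 = 0
else:
    print("x0 is an irregular singular point")
```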

For the ordinary power series method, start by assuming the solution of the differential equation to be of the form

y(x) = sum_(k=0)^oo c_k (x-x_0)^k

Compute the ith derivative of y(x):

y^((i))(x) = sum_(k=0)^oo c_k (k-i+1) (k-i+2) cdots k (x-x_0)^(k-i) = sum_(k=i)^oo c_k (k-i+1) (k-i+2) cdots k (x-x_0)^(k-i)
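As a quick sanity check of this term-wise differentiation formula, the following SymPy sketch compares it against direct differentiation of a truncated series; the truncation order N and the derivative order i are arbitrary assumed values.

```python
import sympy as sp

x = sp.symbols('x')
N, i = 8, 3                       # assumed truncation and derivative orders
c = sp.symbols(f'c0:{N + 1}')     # coefficients c_0, ..., c_N

# Truncated ansatz around x_0 = 0
y = sum(c[k] * x**k for k in range(N + 1))

# Term-wise formula: the k-th term contributes c_k * k(k-1)...(k-i+1) * x^(k-i),
# and the terms with k < i vanish.  sp.ff is the falling factorial.
termwise = sum(c[k] * sp.ff(k, i) * x**(k - i) for k in range(i, N + 1))

print(sp.expand(sp.diff(y, x, i) - termwise) == 0)   # True
```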

Substituting the computed derivatives into the differential equation should give a recurrence relation for the coefficients c_k.

Solving that recurrence relation gives at least one solution of the differential equation, valid on the interval (x_0-R, x_0+R), where R is the radius of convergence of the power series found.

This method is generally used for differential equations with polynomial coefficients (that is, the a_i(x) are polynomials). For ODEs with non-polynomial coefficients, it would be necessary to expand the coefficient functions a_i(x) into their Taylor series around x_0 and multiply them by the Taylor series for y^((i)) (x), which takes considerable effort.
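To make this concrete for an ODE with polynomial coefficients, here is a hedged SymPy sketch that substitutes a truncated series into Airy's equation y'' - x y = 0 (chosen purely as an illustration) and reads off the coefficient equations that form the recurrence.

```python
import sympy as sp

x = sp.symbols('x')
N = 8
c = sp.symbols(f'c0:{N + 1}')

# Illustrative ODE with polynomial coefficients: Airy's equation y'' - x*y = 0
y = sum(c[k] * x**k for k in range(N + 1))
residual = sp.expand(sp.diff(y, x, 2) - x * y)

# Setting the coefficient of each power of x to zero reproduces the
# recurrence c_2 = 0 and c_(k+2) = c_(k-1) / ((k+2)(k+1)) for k >= 1.
for power in range(N - 1):        # powers fully determined by the truncation
    print(f"x^{power}:", sp.Eq(residual.coeff(x, power), 0))
```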

A simple example (usually solved by more elementary methods) to illustrate the recurrence relations that appear for the coefficients c_k:

Finding the solution around x_0=0 of the ODE:

y^prime (x) - y(x) = 0

Computing the derivative and substituting it into the DE, we get:

sum_(k=1)^(oo) k c_k x^(k-1) - sum_(k=0)^(oo) c_k x^k = 0

Changing the index of the first sum using the relation j = k-1, we get:

sum_(j=0)^(oo) (j+1) c_(j+1) x^j - sum_(k=0)^(oo) c_k x^k = 0

Since j and k are just dummy indices, we can rename both to a single new index l and combine the sums, rewriting the equation above as:

sum_(l=0)^(oo)[(l+1) c_(l+1) - c_l]x^l=0

which is valid if and only if

(l+1) c_(l+1) - c_l = 0 iff c_(l+1) = c_l/(l+1)

Iterating, we find for the ith coefficient c_i:

c_i = c_(i-1)/(i) = c_(i-2)/(i (i-1)) = cdots = c_0/(i!)
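A minimal numerical sketch of this result, assuming c_0 = 1: iterate the recurrence c_i = c_(i-1)/i and check it against c_0/(i!).

```python
from math import factorial

c0 = 1.0                           # assumed value of the free coefficient
c = [c0]
for i in range(1, 10):
    c.append(c[i - 1] / i)         # the recurrence c_i = c_(i-1) / i

assert all(abs(c[i] - c0 / factorial(i)) < 1e-12 for i in range(10))
print(c[:5])    # [1.0, 1.0, 0.5, 0.1666..., 0.0416...]
```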

Therefore:

y(x) = c_0 sum_(l=0)^(oo) 1/(l!) x^l

and

y(x) = c_0 e^x,

which is the well-known solution to this problem.
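As a closing check, here is a short sketch (again assuming c_0 = 1, i.e., the solution with y(0) = 1) comparing the truncated power series with e^x:

```python
from math import exp, factorial

def y_series(x, n_terms=20):
    # Partial sum of  sum_(l=0)^oo  x^l / l!   (with c_0 = 1)
    return sum(x**l / factorial(l) for l in range(n_terms))

for x in (0.0, 0.5, 1.0, 2.0):
    print(x, y_series(x), exp(x))  # the two values agree to machine precision
```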