How do you show whether the improper integral int 1/(1+x^2) dx converges or diverges from negative infinity to infinity?

1 Answer
Oct 24, 2015

I would prove that it converges by evaluating it.

Explanation:

int 1/(1+x^2) dx = tan^-1x + C

If you don't know, or have forgotten the "formula", then use a trigonometric substitution:

x = tan theta gives us dx = sec^2 theta d theta and the integral becomes

int 1/(1+tan^2theta) sec^2theta d theta = int sec^2theta/sec^2theta d theta = int d theta = theta + C = tan^-1 x + C
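As a quick sanity check on the antiderivative (my addition, not part of the original answer), a central finite difference of tan^-1 x in Python should reproduce 1/(1+x^2) at any sample point:

```python
import math

def f(x):
    return 1 / (1 + x**2)

# Central finite difference of atan(x); if tan^-1 x is an
# antiderivative of 1/(1+x^2), this should match f(x) closely.
h = 1e-6
for x in (-3.0, -1.0, 0.0, 0.5, 2.0):
    deriv = (math.atan(x + h) - math.atan(x - h)) / (2 * h)
    assert abs(deriv - f(x)) < 1e-6, (x, deriv, f(x))
print("antiderivative check passed")
```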

Recall that lim_(xrarroo)tan^-1x = pi/2 and lim_(xrarr-oo)tan^-1x = -pi/2
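These limits are easy to confirm numerically (a small Python sketch of my own, using only the standard library):

```python
import math

# tan^-1 x approaches pi/2 as x -> oo and -pi/2 as x -> -oo;
# the gap pi/2 - atan(x) shrinks as x grows.
for x in (10.0, 1e3, 1e6):
    print(x, math.atan(x), math.pi / 2 - math.atan(x))
```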

We need to split the integral. 0 is usually easy to work with, so let's use it.

int_-oo^oo 1/(1+x^2) dx = lim_(ararr-oo) int_a^0 1/(1+x^2) dx + lim_(brarroo) int_0^b 1/(1+x^2) dx

= lim_(ararr-oo) [tan^-1 x]_a^0 + lim_(brarroo) [tan^-1 x]_0^b

= lim_(ararr-oo)[tan^-1(0) - tan^-1(a)] + lim_(brarroo)[tan^-1b - tan^-1 0]

= lim_(ararr-oo)[0 - tan^-1(a)] + lim_(brarroo)[tan^-1b - 0]

= [-(-pi/2)]+[pi/2] = pi

The integral converges.
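As a final check (a numerical sketch I'm adding, not part of the original answer), a midpoint-rule approximation of the integral over [-N, N] should approach pi as N grows, since the exact value on that interval is 2 tan^-1 N:

```python
import math

def f(x):
    return 1 / (1 + x**2)

# Midpoint-rule approximation of int_a^b f(x) dx.
def integrate(a, b, n=200_000):
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

# The truncated integral equals 2*atan(N), which tends to pi,
# so the remaining error pi - approx should shrink as N grows.
for N in (10, 100, 1000):
    approx = integrate(-N, N)
    print(N, approx, math.pi - approx)
```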