Using the integral test, how do you show whether #sum 1/(n*n)# diverges or converges from n=1 to infinity?

2 Answers
Mar 25, 2018

Converges.

Explanation:

Let's first verify that #f(n)=1/(n*n)=1/n^2# is positive, decreasing, and continuous on #[1, oo)#. It is positive because #n^2>0# for #n>=1#; it is decreasing because #n^2# grows as #n# increases, so #1/n^2# shrinks toward #0# without ever becoming negative; and it is continuous because its only discontinuity is at #n=0#, which lies outside #[1, oo)#. So we can indeed use the Integral Test.

Let #f(x)=1/x^2#

Now, take #int_1^oo1/x^2dx=lim_(t->oo)int_1^t1/x^2dx=lim_(t->oo)(-1/x)|_1^t#

This gives

#lim_(t->oo)(-1/t+1)=1#

So, since #int_1^oo1/x^2dx# converges (has a finite value), #sum_(n=1)^oo1/n^2# also converges (but we cannot determine its value from the Integral Test alone).
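As an optional sanity check (an illustration beyond what the Integral Test itself provides), here is a minimal Python sketch comparing the partial sums of #1/n^2# with the truncated integrals #int_1^t1/x^2dx=1-1/t#; the helper names `partial_sum` and `truncated_integral` are just for this example.

```python
# Numerical illustration (not part of the Integral Test argument):
# compare partial sums of 1/n^2 with the truncated integral 1 - 1/t.

def partial_sum(N):
    """Sum of 1/n^2 for n = 1..N."""
    return sum(1.0 / n**2 for n in range(1, N + 1))

def truncated_integral(t):
    """Integral of 1/x^2 from 1 to t, which equals 1 - 1/t."""
    return 1.0 - 1.0 / t

for N in (10, 100, 1000, 10000):
    print(f"N={N:>6}  partial sum={partial_sum(N):.6f}  integral to N={truncated_integral(N):.6f}")
```

The truncated integrals approach #1#, and the partial sums stay bounded and creep toward roughly #1.6449# (the known value #pi^2/6#, a separate result that the Integral Test does not supply), consistent with convergence.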

Mar 25, 2018

#sum_(n=1)^oo1/n^2# converges

Explanation:

According to the integral test, since #f(x)=1/x^2# is positive, continuous, and decreasing on #[1, oo)#, if

#int_1^oo1/x^2dx#

converges, then

#sum_(n=1)^oo1/n^2#

must also converge.

Let's compute the integral:

#int_1^oo1/x^2dx=lim_(t->oo)int_1^tx^-2dx#

#=lim_(t->oo)[-1/x]_1^t#

#=lim_(t->oo)[(-1/t)-(-1/1)]#

#=0-(-1)=1#

The integral converges, so according to the integral test, the series must also converge!
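As a practical follow-up (an extra observation, not claimed by either answer), the same integral also bounds the tail of the series, since #sum_(n=N+1)^oo1/n^2 <= int_N^oo1/x^2dx = 1/N#. The short Python sketch below checks this remainder bound numerically; it uses the known exact sum #pi^2/6# only to measure the true remainder, and the helper name `partial_sum` is just for this example.

```python
# Illustration of the integral-test remainder bound (an observation beyond
# the answers above): the tail sum_{n>N} 1/n^2 is at most 1/N, the value
# of the integral of 1/x^2 from N to infinity.

import math

def partial_sum(N):
    """Sum of 1/n^2 for n = 1..N."""
    return sum(1.0 / n**2 for n in range(1, N + 1))

for N in (10, 100, 1000):
    remainder = math.pi**2 / 6 - partial_sum(N)  # true remainder after N terms
    bound = 1.0 / N                              # integral of 1/x^2 from N to infinity
    print(f"N={N:>5}  remainder={remainder:.6f}  integral bound={bound:.6f}")
```

In each case the remainder sits just below the #1/N# bound, which is the sense in which the integral controls how fast the series converges.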