Is it possible for an integral of the form #int_a^oo f(x)\ dx#, with #lim_(x->oo)f(x)!=0#, to still be convergent?

If you view the integral as the area under the curve, it seems logical that there is no way that the integral
#int_a^oo f(x)\ dx#
would converge unless #f(x)# eventually tends to zero
#lim_(x->oo)f(x)=0#
since the area under the graph wouldn't be bounded otherwise.

My question is: are there integrals where this is not the case, where the limit of the function doesn't go to zero but the integral still converges? What would be an example of such a function?

1 Answer
Jan 5, 2018

If the limit #lim_(x->oo) f(x) = L# exists, then #L=0# is a necessary (but not sufficient) condition for the integral to converge.

In fact, suppose #L > 0#: since the limit is positive, #f(x)# must eventually stay bounded away from zero, so we can find a number #epsilon > 0# and a number #M# such that:

#f(x) >= epsilon # for #x > M#

So:

#int_a^t f(x) dx = int_a^M f(x) dx + int_M^t f(x) dx#

and since #f(x) >= epsilon# on #[M, t]#, writing #I_0 = int_a^M f(x) dx#:

#int_a^t f(x) dx >= I_0 + epsilon(t-M)#

which clearly diverges as #t->oo#.

If #L < 0#, we can apply the same argument to #-f(x)#.
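
As a quick numerical sketch of the divergence above (the specific function #f(x) = 1 + 1/x#, with #L = 1#, and the script itself are just an illustrative choice, not part of the argument):

```python
from scipy.integrate import quad

# Illustrative choice: f(x) = 1 + 1/x has limit L = 1 > 0 at infinity,
# so with M = 1 and epsilon = 1 we have f(x) >= epsilon for all x > M.
# The partial integrals int_1^t f(x) dx must then grow at least like
# I_0 + epsilon*(t - M), hence diverge.

f = lambda x: 1.0 + 1.0 / x
a = M = 1.0          # here a = M, so I_0 = int_a^M f(x) dx = 0
epsilon = 1.0

for t in [10, 100, 1_000, 10_000]:
    integral, _ = quad(f, a, t)
    lower_bound = epsilon * (t - M)   # I_0 + epsilon*(t - M) with I_0 = 0
    print(f"t = {t:6d}:  int_1^t f(x) dx = {integral:12.2f}  >=  {lower_bound:10.1f}")
```

The printed partial integrals keep pace with the linear lower bound, so they cannot approach a finite limit.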

For the same reason, if #f(x) > 0# (or #f(x) < 0#) for #x >= M#, the integral can only converge if #f(x)# gets arbitrarily close to zero for large #x#: if #abs(f(x)) >= epsilon# beyond some point, the same lower bound shows the integral diverges.

However, if #lim_(x->oo) f(x)# does not exist and the function does not have a definite sign around #+oo#, the condition is not necessary: the integral can converge even though #f(x)# does not tend to zero.

A classical counterexample is the Fresnel integral #int_0^oo sin(x^2)\ dx#, which converges (to #sqrt(pi/8)#) even though #sin(x^2)# oscillates between #-1# and #1# and has no limit as #x->oo#.
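
As a rough numerical check (the trapezoid-based script below is only an illustration), the partial integrals #int_0^T sin(x^2) dx# settle near #sqrt(pi/8) ~~ 0.6267# as #T# grows, while #sin(T^2)# keeps oscillating:

```python
import numpy as np

# Estimate int_0^T sin(x^2) dx with a fine composite trapezoid rule and
# compare with sqrt(pi/8).  The integrand keeps oscillating between -1
# and 1 (no limit at infinity), yet the partial integrals settle down.

def partial_integral(T, n=400_000):
    """Composite trapezoid estimate of int_0^T sin(x^2) dx."""
    x = np.linspace(0.0, T, n)
    y = np.sin(x**2)
    dx = x[1] - x[0]
    return dx * (y.sum() - 0.5 * (y[0] + y[-1]))

target = np.sqrt(np.pi / 8)   # exact value of the improper integral
print(f"sqrt(pi/8) = {target:.5f}")
for T in [5, 10, 20, 40, 80]:
    est = partial_integral(T)
    print(f"T = {T:3d}:  int_0^T sin(x^2) dx = {est: .5f},   sin(T^2) = {np.sin(T**2): .3f}")
```

The convergence here comes from cancellation: as #x# grows the oscillations become faster, so the positive and negative lobes of #sin(x^2)# nearly cancel, even though their heights never shrink.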