
## AP®︎ Calculus BC (2017 edition)

### Unit 9: Lesson 6

Improper integrals

# Divergent improper integral

AP.CALC: LIM‑6 (EU), LIM‑6.A (LO), LIM‑6.A.1 (EK), LIM‑6.A.2 (EK)
Sometimes the value of an improper integral is, well, infinite. Created by Sal Khan.

## Want to join the conversation?

• So, f(x)= x^(-1) has an infinite area from x=1 to infinity, but f(x)=x^(-2) has a FINITE area from x=1 to infinity.
What I'm wondering is: is there any exponent between -1 and -2 that is the 'divider' between a divergent and a finite area under the curve? That is, x raised to anything less than that exponent has a finite area (between x=1 and infinity), and x raised to anything greater (closer to x^(-1)) is infinite/divergent.
• I'm not a math teacher, but from working out a few problems, it seems like the dividing line is x^(-1) itself.

At x^(-1), of course, the integral becomes a natural log, as Sal just solved for above.

If the exponent of x is less than -1, then the antiderivative of the original expression will be some constant multiplied by x^(a negative number). When we evaluate the limit, the lower bound (1) produces some constant, but the other term, once we substitute n for x, now has n in the denominator. As n goes to infinity, that term vanishes because the limit of 1/n is zero, leaving us with a constant and thus a finite value.

However, if the exponent is greater than -1, integrating the original expression gives a constant times x^(a positive number). When we substitute n for x and let n go to infinity, this term goes to infinity, giving an infinite (divergent) result.

Edited for clarity.
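To see the answer above numerically, here is a small sketch of my own (not from the thread) that evaluates the integral of x^p from 1 to b using the closed-form antiderivative. For p < -1 the values settle toward a finite limit as b grows; for p >= -1 they keep growing:

```python
import math

def partial_integral(p, b):
    """Closed-form value of the integral of x**p from 1 to b."""
    if p == -1:
        return math.log(b)                 # antiderivative is ln(x)
    return (b ** (p + 1) - 1) / (p + 1)    # antiderivative is x^(p+1)/(p+1)

for p in (-2.0, -1.0, -0.5):
    print(p, [round(partial_integral(p, b), 4) for b in (10, 1e3, 1e6)])
```

For p = -2 the values approach 1, for p = -1 they grow like ln b, and for p = -0.5 they blow up even faster, matching the dividing line described above.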
• Very interesting: 1/x^2 has an area of 1 from x = 1 to infinity,
while 1/x has an infinite area over the same interval.

I can't seem to grasp this weird thing.
• Maybe think about it this way: 1/x decreases in proportion with x as x grows, so it's always adding in proportion to the amount it's decreasing by.
1/x^2 decreases in proportion to the SQUARE of x as x grows, so the proportion added decreases as x gets larger.

Just to mess with you a little more: 1/x^1.000001 also has a finite area.
• It doesn't make any sense to me that 1/x^2 would have a finite area, while 1/x would have an infinite area. 1/x^2 is essentially just taking 1/x and stretching it differently. I get that they might not have the same area, but it doesn't seem logical that one would have a finite area and the other wouldn't.
• This is a natural reaction, I think. Maybe you could convince yourself by studying the behaviour of the series Σ(n→∞) 1/n and the series Σ(n→∞) 1/n² and by understanding the proofs that the first one diverges and the second one converges. You could also do this by calculating ∫(1→∞) 1/xᵃ dx. This will result in two situations: for a ≤ 1 this thing diverges, and for a > 1 it converges. And by the way -this will maybe help you in your search- they are called, respectively, the (divergent) harmonic series and the (convergent) hyperharmonic series. Also: http://www.youtube.com/watch?v=aKl7Gwh297c
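As a numeric companion to that comparison (my sketch, not part of the linked video): partial sums of the harmonic series keep climbing, while partial sums of Σ 1/n² settle near π²/6:

```python
import math

def partial_sum(exponent, terms):
    """Sum of 1/n**exponent for n = 1 .. terms."""
    return sum(1.0 / n ** exponent for n in range(1, terms + 1))

for N in (100, 10_000, 1_000_000):
    print(N, round(partial_sum(1, N), 3), round(partial_sum(2, N), 6))

print(math.pi ** 2 / 6)  # limit of the convergent series, about 1.644934
```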
• How did 1/x become ln |x|?
• You mean as opposed to ln x ?

The domain of ln x is x > 0; you can't take the logarithm of something that is <= 0. For x < 0, however, the derivative of ln(-x) is also 1/x, so ln |x| serves as an antiderivative of 1/x on either side of zero.
• Give an example of a convergent improper integral
• An elementary example would be

`∫ [1, ∞) 1/x² dx,`

where `∫ [1, ∞)` means integral over `[1, ∞)`. More generally,

`∫ [1, ∞) 1/xᵃ dx`

converges whenever `a > 1` and diverges whenever `a ≤ 1`. These integrals are frequently used in practice, especially in the comparison and limit comparison tests for improper integrals.

A more exotic result is

`∫ (-∞, ∞) x sin(x)/(x² + a²) dx = π/eᵃ,`

which holds for all `a > 0`. In particular,

`∫ [0, ∞) x sin(x)/(1 + x²) dx = π/(2e).`

Yet more examples are the Fresnel integrals

`∫ [0, ∞) sin(x²) dx = ∫ [0, ∞) cos(x²) dx = (1/4)√(2π).`
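That middle result can be sanity-checked numerically. The sketch below is mine: a plain midpoint rule, truncating the integral at R = 1000 (the discarded tail is only of order 1/R), which lands close to π/(2e):

```python
import math

def f(x):
    return x * math.sin(x) / (1.0 + x * x)

# Midpoint rule on [0, R] with step h; fine enough to resolve the oscillation.
R, h = 1000.0, 0.001
n = int(R / h)
approx = h * sum(f((k + 0.5) * h) for k in range(n))

print(approx, math.pi / (2 * math.e))  # both approximately 0.578
```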
• I very much appreciate Arni Häcki's answer to José Guilherme Licio's question, but even though it gives an illuminating intuition as to why ln(x) doesn't get to a "limit" as x approaches infinity, their answer does not address another point. What exactly is wrong with the following argument?

Since the derivative of ln(x), that is, 1/x, has a limit as x approaches ∞ (and this limit is 0), why can't we say that ln(x) "stops growing" at some point, in essentially the same way we say that "f(x) = 1 - 1/x" stops growing as x approaches ∞? We say f(x) "stops growing" precisely because the limit of 1/x as x approaches ∞ is 0. But using this same argument regarding the derivative of ln(x) doesn't seem to work. What is, precisely, the difference, and why does this difference matter?

Thank you!
• There is something that always confuses me: the natural logarithm function "grows" less and less for big values of x (as seen at on the video)... But why, then, doesn't it have a limit value, since its derivative seems to be zero at infinity?
• I want to continue from "Arni Hacki's question".

Here, look at the plot of e^x:
http://www.wolframalpha.com/input/?i=plot+e%5Ex

Doesn't it look as if this function diverges so fast that at some value it becomes parallel to the y-axis and is not defined for any x after this value? Well, that would be equivalent to ln converging to some y. It is just that our minds are tuned to seeing functions converge if the rate of change keeps decreasing, but not to functions growing to infinity at a certain point and never coming down again.
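One way to see that a derivative tending to 0 is not enough for a finite limit (a quick sketch of my own): ln(x) still passes any bound M you name, it just waits until x = e^M, whereas 1 - 1/x, whose derivative also tends to 0, really is trapped below 1:

```python
import math

# ln(x) exceeds any target M once x reaches e**M (M kept below ~700 to avoid overflow)
for M in (10, 100, 700):
    x = math.exp(M)
    print(M, math.log(x))  # recovers M: the log really does reach that height

# 1 - 1/x also has derivative 1/x**2 -> 0, yet it never exceeds 1
print(max(1 - 1 / x for x in (1e3, 1e6, 1e9)))
```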
• Hi everyone,

Throughout the course of the episode, I understand how and why integrating the function `1/x` over the interval `x = 1` to `x = infinity` works, but I'm confused as to how this idea agrees with the Comparison Theorem, especially pertaining to one of the principles of the theorem saying that if g(x) (i.e. `1/x`) is divergent from `x = 1` to `x = infinity`, then f(x) (i.e. `1/x^3`) is also divergent.

In other words, what causes f(x) to diverge if g(x) diverges?

Thank you all,
Aviel Rodriguez
• 1/x^3 is convergent

Using the theorem with 1/x can't tell you whether 1/x^3 converges or diverges. 1/x^3 < 1/x, but since 1/x doesn't converge, that inequality tells us nothing about 1/x^3. To show convergence, you need to find a function greater than the original that itself converges; to show divergence, you need a function less than the original that diverges.

For instance if you used 1/x^2, since that converges and is also greater than 1/x^3 we can conclude 1/x^3 also converges.

You would need another test to conclude 1/x^2 converges initially though.

Now, if a function f(x) diverges and then another function g(x) > f(x) the comparison test tells us that this larger g(x) also diverges.
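A small numeric illustration of that bounding argument (my sketch, using closed-form partial integrals on [1, b]): the running integral of 1/x³ stays below that of 1/x², which itself stays below 1, while the running integral of 1/x keeps growing:

```python
import math

def running_integral(p, b):
    """Integral of x**p from 1 to b via the antiderivative."""
    return math.log(b) if p == -1 else (b ** (p + 1) - 1) / (p + 1)

for b in (10.0, 1e3, 1e6):
    i3 = running_integral(-3, b)   # bounded above by i2
    i2 = running_integral(-2, b)   # bounded above by 1
    i1 = running_integral(-1, b)   # unbounded: ln(b)
    print(b, round(i3, 6), round(i2, 6), round(i1, 3))
```

Because the partial integrals of a positive function increase with b, staying bounded (as 1/x³ does, squeezed under 1/x²) is exactly what convergence means.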
• Why did we use n approaching infinity and not just x approaching infinity? It just seems like an extra step that is later undone.
• It's not 𝑥 that approaches infinity, but the upper bound on 𝑥.
That's why Sal uses another variable.