### Course: AP®︎/College Calculus BC > Unit 10

Lesson 15: Representing functions as power series

- Integrating power series
- Differentiating power series
- Integrate & differentiate power series
- Finding function from power series by integrating
- Integrals & derivatives of functions with known power series
- Interval of convergence for derivative and integral


# Interval of convergence for derivative and integral

Integrating or differentiating a power series term by term is only valid for x values within the interval of convergence. The interval of convergence of the integral/derivative will be the same, except possibly at the endpoints. See an example here.
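As a concrete numeric sketch of that endpoint caveat (my own illustration, not part of the lesson, using the series from the video, f(x) = sum of x^n/n): at x = -1 the original series converges, but its term-by-term derivative does not.

```python
# Numeric sketch of the endpoint claim: the series sum_{n>=1} x^n / n converges
# at x = -1, but its term-by-term derivative sum_{n>=1} x^(n-1) does not.
import math

def partial_sums(terms):
    """Running partial sums of an iterable of terms."""
    total = 0.0
    out = []
    for t in terms:
        total += t
        out.append(total)
    return out

N = 100_000
x = -1.0

# Original series at x = -1: the alternating harmonic series, which converges to -ln 2.
orig = partial_sums(x ** n / n for n in range(1, N + 1))

# Derivative series at x = -1: terms (-1)^(n-1), so partial sums oscillate 1, 0, 1, 0, ...
deriv = partial_sums(x ** (n - 1) for n in range(1, N + 1))

print(orig[-1], -math.log(2))  # both close to -0.6931...
print(deriv[-4:])              # still oscillating: [1.0, 0.0, 1.0, 0.0]
```

A numeric check like this is not a proof, but it shows the behaviour the ratio-test analysis below predicts.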

## Want to join the conversation?

- I'm a bit confused at 6:36.

I understand how (1)^(n-1) will diverge, since it is simply 1 + 1 + 1 + 1 + 1 + ...

But for (-1)^(n-1), that comes out to 1 + (-1) + 1 + (-1) + 1 + (-1) + 1 + (-1) + 1 + ...

If we re-express that as 1 + [(-1) + 1] + [(-1) + 1] + [(-1) + 1] + [(-1) + 1] + ...

shouldn't the remaining values cancel out, leaving the series to converge to 1?

Am I approaching this incorrectly?

- Why did you choose to cancel out that way?

Why not (1 - 1) + (1 - 1) + (1 - 1) + ... = 0 ?

Or why not this?

S = 1 - 1 + 1 - 1 + ...

Add S to a copy of itself shifted one term to the right, so the terms cancel in pairs and only the leading 1 survives:

=> 2S = 1

So S = 1/2

Or countless other re-arrangements?

You're not "approaching this incorrectly"; you are making an incorrect assumption: namely, that infinite sums behave the same way as finite ones, specifically with regard to the associativity of addition. With a series that doesn't converge, we can no longer put the parentheses wherever we like and necessarily get the same result.

If we take the partial sums of (-1)^(n-1) we get

1, 0, 1, 0, 1...

We say a series converges to a value if, for any given "neighbourhood" of that value, the partial sums eventually enter that neighbourhood and never leave it again; equivalently, only a finite number of partial sums lie outside any such neighbourhood.

So your series doesn't converge on 1: the first partial sum is 1, but the next one (0) is further away, and that keeps happening forever.

Nor does it converge on zero, for essentially the same reason.

We can't even say it converges on 1/2, since the partial sums never get any closer to 1/2 than a distance of 1/2; they keep jumping across the whole interval [0, 1].
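The regrouping point in this answer can be made concrete with a short check (my own sketch, not from the thread): the same run of terms of 1 - 1 + 1 - 1 + ... grouped two different ways gives two different "sums", which is exactly why associativity cannot be trusted for a non-convergent series.

```python
# Terms of the Grandi-type series 1 - 1 + 1 - 1 + ... grouped two different ways.
# Because the series does not converge, regrouping is not a safe operation,
# which is why both "answers" below look plausible yet disagree.

terms = [(-1) ** (n - 1) for n in range(1, 22)]  # 21 terms: 1, -1, 1, ..., 1

# Grouping as 1 + [(-1) + 1] + [(-1) + 1] + ... over all 21 terms: each bracket is 0.
grouping_a = terms[0] + sum(terms[i] + terms[i + 1] for i in range(1, 21, 2))

# Grouping as (1 - 1) + (1 - 1) + ... over the first 20 terms: each bracket is 0.
grouping_b = sum(terms[i] + terms[i + 1] for i in range(0, 20, 2))

print(grouping_a, grouping_b)  # 1 0
```

The two groupings secretly stop at partial sums of different parity, and for this series the odd and even partial sums never agree.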

- At 1:24, why can "you only integrate or differentiate a power series term-by-term for x values within the interval of convergence for the power series"?
- There's an interesting reason for it.

I'll take a simpler function though. Say I have 1/(1-x) and I want to find the power series for it. You can see that this can be the closed form of a geometric series with a = 1 and r = x. So, we get a power series of (sum from n = 0 to infinity of x^(n)). You can verify for yourself that this is true, but see the equality which emerges here:

Sum from n = 0 to infinity of x^(n) = 1/(1-x)

Now, this isn't always true. It is only true when |x| < 1. Unless the absolute value of the common ratio is less than 1, a geometric series cannot converge, and hence the power series we got on the left cannot have the closed form on the right. So taking the derivative/integral wouldn't make sense when the equality itself doesn't hold. That's why x strictly needs to be within the interval of convergence.
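A quick numeric sketch of that equality (my own illustration): the closed form 1/(1-x) matches the partial sums of the geometric series only when |x| < 1.

```python
# Sketch of why x must stay inside the interval of convergence: the closed form
# 1/(1-x) only matches the geometric series sum_{n>=0} x^n when |x| < 1.

def geometric_partial(x, terms):
    """Partial sum 1 + x + x^2 + ... with the given number of terms."""
    return sum(x ** n for n in range(terms))

inside = 0.5
print(geometric_partial(inside, 60), 1 / (1 - inside))  # both ~2.0

outside = 2.0
print(geometric_partial(outside, 60))  # astronomically large: the series diverges,
                                       # while 1/(1-2) = -1 is meaningless here
```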

- How do we actually check the integral one for when x = 1 and x = -1? I couldn't find a familiar pattern like in the derivative one. I found that when x = 1 it's Σ 1/((n+1)n). How do we check for convergence here?
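One standard way to check this series (my addition, not an answer from the thread) is the telescoping identity 1/(n(n+1)) = 1/n - 1/(n+1), which pins down the partial sums exactly:

```python
# The series at x = 1 is sum_{n>=1} 1/((n+1)n). By the telescoping identity
# 1/(n(n+1)) = 1/n - 1/(n+1), the N-th partial sum is exactly 1 - 1/(N+1),
# so the series converges (to 1).

def partial(N):
    return sum(1.0 / (n * (n + 1)) for n in range(1, N + 1))

for N in (10, 100, 1000):
    print(N, partial(N), 1 - 1.0 / (N + 1))  # the last two columns agree
```

This is consistent with the video's closing remark that the integral's series converges at both endpoints.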
- What do you mean when you say "we can only do this within the interval of convergence" at 1:23? I'm kind of lost as to why this is true. If the series diverges, we can't take derivatives or antiderivatives of it?
- For the interval of convergence for the integral at the endpoints, are we really looking at a p-series? Or do you actually use the comparison test with 1/p^2? I just want to make sure I'm not making the problem too difficult. :)
- I think of it as a comparison test against the infinite sum of the terms 1/n^p, which converges if p > 1. Notice that I've put the placeholder "p" in the exponent of the denominator, whereas you've placed it as the base, with power 2 (which would converge), in the denominator.

- 1. At 6:46, what does "series centered at zero" have to do with "the radius of convergence is the same"?

2. Around 6:56, does Sal mean the radius of convergence for a power series and its derivative and integral are the same? Thanks.

- 1. Nothing, really. They are two independent facts. The series he used is centered at 0 (if it were centered at, say, x = a, it'd be written as [(x-a)^n]/n), and the radius of convergence of that series and its derivative is the same. Even if I took a series centered at some a, that series and its derivative would still have the same radius of convergence (only here, instead of the interval running from -R to R, where R is the radius of convergence, it would run from a - R to a + R).

2. Yeah. Differentiating/integrating a power series doesn't change its radius of convergence. However, the interval of convergence may change (as Sal shows, the series converges at x = -1 and not at x = 1, while its derivative converges at neither endpoint).

- At 6:46, Sal says "series centered at 0". Can any series have a center? I thought only Maclaurin and therefore Taylor series had that. Also, can you explain to me what having a center means?
- When doing the ratio test for the integral, why hasn't he included the n beside the (n+1) in the integral's term? That would cancel out with the n in the denominator of the original function and give a limit equal to zero, as it would be x/(n+1).
- Is this a common trait? Is it always the case that the derivative has one less "=" (included endpoint) on the left side and the integral has one more "=" on the right side? Or is it just a coincidence?
- Hello, I understand the reason why Σ(x^(n-1)) diverges when x = -1, but intuitively each term cancels out the one before it, and the sum should be equal to 0. I found a demonstration (which must be wrong, but I don't see why):

Σ(n=1; infinity) x^(n-1) = x^0 + x^1 + x^2 + x^3 + ... = Σ(n=0; infinity) (x^(2n) + x^(2n+1)). When x = -1, this gives Σ(n=0; infinity) ((-1)^(2n) + (-1)^(2n+1)) = Σ(n=0; infinity) (1 - 1) = 0.

If you could help me I would be grateful. Thanks.

- "I understand the reason why Σ(x^(n-1)) diverges when x = -1, but intuitively each term cancels out the one before it, and the sum should be equal to 0."

So you just use the definition of convergence: there is a formal definition you can refer to, or you can use the more intuitive one. Either way, the partial sums of the series never settle on a single value, so the series has no sum, and the pairing trick silently changes which partial sums you are looking at.

## Video transcript

- [Instructor] At times in our dealings with power series, we might wanna take the derivative, or we might want to integrate them. And in general, we can
do this term by term. What do I mean by that? Well, that means that the derivative of f, f prime of x, is just
gonna be the derivative of each of these terms. So that's gonna be the sum
from n equals one to infinity. And let's see, the
derivative of x to the n is n times x to the n minus one. So I could write this as n
times x to the n minus one all of that over n. And these ns will cancel out, so this is just going to be, this is just going to
be x to the n minus one. So this is taking the
derivative with respect to x. Similarly, we could integrate, we could integrate and we could evaluate, we could evaluate the
integral of f of x dx, and this is going to be
equal to some constant plus, if we integrate this term by term. And so this is going
to be equal to the sum from n equals one to infinity. And let's see, we increment the exponent, so x to the n plus one,
and then we divide by that. So times n plus one times
this n right over here. So this is a common
technique that you will see when dealing with power series. And we're gonna go a little
bit more into the details, because you can only do this for x values within the interval of
convergence for the power series. And as we will see, the
interval of convergence for these different series
is slightly different. The intervals are very similar, but what happens at the
endpoint is different. So I encourage you, pause this video, and see if you can figure out
the interval of convergence for each of these series. This is the integral
of our original series, and this is the derivative
of our original series. So let's start with our original series. Let's figure out the
interval of convergence. So we could do that using the ratio test. So the ratio test, we
would want to do the limit, the limit as n approaches
infinity of a sub n plus one, so that's gonna be x to the
n plus one over n plus one, divided by a sub n, so
that's x to the n over n. So we want to take the
absolute value of that. That's gonna be the limit
as n approaches infinity. Let's see, this is, if
you divide this and this by x to the n, that's gonna be a one, and this is just going to be an x, and then this n is going to end up on top. So this is going to be x times n over n plus one. And this is equal to the limit
as n approaches infinity of, let's see, if we divide the
numerator and denominators here by one over, if we divide by both the numerator
and the denominator by n, we're gonna get x over
one plus one over n. And what is this going to be? Well, this term's gonna go to zero, so this is just gonna be equal
to the absolute value of x. And the ratio test tells us
that this series is convergent if this right over here is less than one, it's divergent if this
is greater than one, and it's inconclusive if this equals one. So we know, let's write that down. We know we are convergent, convergent, convergent for the absolute value of x less
than one when this thing, when it is less than one. We know that we are divergent when this thing is greater than one, when the absolute value
of x is greater than one. But what about when the absolute
value of x is equal to one? That's where the ratio test breaks down and we have to test that separately. So let's look at the scenario where x is equal to one. When x equals to one,
this series is the sum from n equals one to infinity
of one to the n over n. Well, that's just gonna be one over n. This is the harmonic
series or the p-series where our p is one. And we've seen in multiple
videos that this diverges. So when x equals one, we diverge. What about when x equals negative one. When x equals negative one,
this thing becomes the sum from n equals one to infinity of negative one to the n over n. And this is often known as the
alternating harmonic series. And this one by the
alternating series test, this one actually converges. And we've seen that in multiple videos. So it turns out the
interval of convergence for our original thing right over here, our interval of convergence, interval of convergence, convergence here, is we can, x can be, so it could be, x can be greater than or equal to negative one, or I could say negative one
is less than or equal to x, because if x is negative
one, we still converge, but then x has to be less than one, because right at one we diverge, so we can't say less than or equal to. So this is the interval of convergence for our original function. What about the interval of
convergence for this one right over here when
we take the derivative? Well, when we take the derivative, this is, this is the same
thing as x to the zero plus x to the first, plus x to the second, and we go on and on and on. Now you might recognize this, this is a geometric series
with common ratio of x. Geometric series, series, where our common ratio, often noted by r, is equal to x. And we know that a
geometric series converges only in the situation where the, where our common ratio,
where the absolute value of our common ratio, so converges, converges, only in the situation
where the absolute value of our common ratio is less than one. So in this situation, when
we took the derivative for f prime of x, our
interval of convergence is almost the same. So here our interval of convergence is going to be x has to be between negative one and one, but it can't be equal to negative one. At negative one we would actually diverge, and at one we would diverge. So notice, these are almost the same. If we view these as
series centered at zero, the radius of convergence is the same. We can go one above, one
below, one above, one below. And that's true in general when we take derivatives and integrals. But the endpoints of our interval of convergence can be different. And to continue to see this, I encourage you to use the ratio test plus
using the boundary conditions, figure out what the
interval of convergence is for the antiderivative,
for the integral here. And what you will see is the radius of convergence is the same. We can go one above
zero and one below zero. We have to be in that interval. But as you will see, this one converges for x equals negative one or x equals one. I'll just cut to the chase here. So, interval of, let me
write that in yellow. The interval of convergence
for this top one converges, converges for negative one is less than x, is less than or equal to one. So notice, they all have the
same radius of convergence, but the interval of convergence,
it differs at the endpoint. And if you wanna prove
this one for yourself, I encourage you to use
a very similar technique that we use for our original function. Use the ratio test, you're
gonna come to this conclusion right over here, and then test the cases when x is equal to one and
x is equal to negative one. And you will see when x
is equal to negative one, you have an alternating p-series,
so that's gonna converge. And then when x equals one,
you're gonna have a p-series where the denominator has
a degree larger than one, or something similar to a p-series. And you can establish
that it will also converge in that scenario as well.
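The three intervals of convergence from the video can be probed numerically at the endpoint x = -1. This is my own sketch, not from the video; the limiting value 2 ln 2 - 1 for the integral's series follows from the telescoping identity 1/(n(n+1)) = 1/n - 1/(n+1) together with the alternating harmonic sum ln 2.

```python
# Numeric sketch of the three intervals of convergence, probed at x = -1,
# with odd N so the oscillation of the divergent series is visible:
#   derivative series sum x^(n-1)             -> interval (-1, 1): diverges here
#   original series   sum x^n / n             -> interval [-1, 1): converges to -ln 2
#   integral series   sum x^(n+1)/((n+1)n)    -> interval [-1, 1]: converges to 2 ln 2 - 1

def partial(term, N):
    return sum(term(n) for n in range(1, N + 1))

x = -1.0
N = 10_001  # odd on purpose; an even N would give 0.0 for the divergent series

deriv = partial(lambda n: x ** (n - 1), N)                  # exactly 1.0
orig = partial(lambda n: x ** n / n, N)                     # ~ -0.6931
integ = partial(lambda n: x ** (n + 1) / ((n + 1) * n), N)  # ~ 0.3863

print(deriv, orig, integ)
```

All three series share the same radius of convergence, 1; only the endpoint behaviour differs, which is the whole point of the video.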