### Course: AP®︎/College Calculus BC > Unit 10

Lesson 14: Finding Taylor or Maclaurin series for a function


# Worked example: power series from cos(x)

Finding a power series to represent x³cos(x²) using the Maclaurin series of cos(x).

## Want to join the conversation?

- At 6:36, Sal says we would have had to find the 19th derivative to find the first five non-zero terms. Why is that the case? I see that the fifth term has x to the 19th power, but by the first derivative we already have x to the 4th power.(13 votes)
- Why can we simply put x^3 into the series without affecting its convergence (or divergence)?(12 votes)
- Great question! But think about what happens when we multiply some series by x^3. Imagine we have a series: Σ (an)x^n, and you tell me "I have determined that this series converges for x = c, and I'll even tell you that when x = c, the series converges to L." Well, if I multiply the series by x^3, what will happen now when x = c? No problem, the whole thing will certainly converge to (c^3)L. The same type of argument should convince you that if some series diverges when x = c, then x^3 (or x^n for any finite n) times the series will also certainly diverge at x = c.(6 votes)
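The argument above can be checked numerically. A minimal sketch in Python, using the geometric series as an illustrative choice (not part of the original discussion):

```python
# If a series converges to L at x = c, multiplying it by x^3
# scales the limit to c^3 * L; it cannot change convergence.
# Demo with the geometric series: sum of x^n -> 1/(1-x) for |x| < 1.
c = 0.5
L = sum(c**n for n in range(200))              # ~ 1/(1 - c) = 2
shifted = sum(c**(n + 3) for n in range(200))  # the series multiplied by x^3
assert abs(shifted - c**3 * L) < 1e-12
print("x^3 * series converges to c^3 * L")
```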

- My question may not be relevant to this lesson, but please help me:

(k+1)! = k!(k+1)

How does that happen?(5 votes)
- If you had 3!, that would be 3*2*1. If you had (3+1)!, you decrement the term each time, so it would be (3+1)! = (3+1)*((3+1)-1)*((3+1)-2)*((3+1)-3) = (3+1)*3*2*1, which is the same thing as (3+1)*3!. Just replace 3 with k.(3 votes)
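The factorial identity in this exchange is easy to verify mechanically; a quick sketch in Python:

```python
from math import factorial

# (k+1)! = (k+1) * k * (k-1) * ... * 1 = (k+1) * k!
for k in range(10):
    assert factorial(k + 1) == factorial(k) * (k + 1)
print("identity holds for k = 0..9")
```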

- Something I just noticed about this technique, we are only creating the Maclaurin series based on the sine and cosine equivalents, evaluating them and their series of derivatives at zero and applying the pattern. But why just the trig functions? Why wouldn't we be plugging x=0 for x^3, or even the x^2 in cos(x^2)?(4 votes)
- Maclaurin/Taylor series require that the function be differentiable infinitely many times (not counting a derivative that is just the constant 0). Thus, polynomials cannot be used for creating Maclaurin/Taylor series.

While Maclaurin/Taylor series don't have to be trig functions, they do have to be something that can be differentiated infinitely many times and must be evaluable at the point under consideration.(11 votes)

- How do you express this in Sigma notation? I have Σ (-1)^n * x^?/(n+1)!

I guess a more accurate question would be: How do I express the powers of x as a function?(3 votes)
- Eight years later, I think I have the answer! I got Σ from n = 0 to infinity of (-1)^n * x^(4n+3) / (2n)!. Incorporating the x^2 was easy, since you just raise (x^2)^(2n), but I was having trouble incorporating the x^3. That's when I realized it would need to be added to the exponent, much as sin(x)'s expansion has the exponent 2n+1, because every term in the series is multiplied by x^3.(4 votes)
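The closed-form answer above, Σ (-1)^n x^(4n+3)/(2n)!, can be tested against the original function directly; a minimal sketch in Python:

```python
import math

def f(x):
    # the target function from the video
    return x**3 * math.cos(x**2)

def series(x, n_terms=30):
    # sum over n of (-1)^n * x^(4n+3) / (2n)!
    return sum((-1)**n * x**(4*n + 3) / math.factorial(2*n)
               for n in range(n_terms))

x = 0.7
assert abs(f(x) - series(x)) < 1e-12
print("sigma-notation series matches x^3 * cos(x^2)")
```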

- Is there any general proof that the Maclaurin series of f(x) = g(x)*h(x) will be exactly the same as the expression we get if we just substitute the Maclaurin series for g(x) and multiply by h(x)? I am not sure this is intuitive to me.

Thank you!(8 votes)
- In the video Sal managed to represent the function f(x) with a polynomial. But how do we know that's the same polynomial we would get if we expanded the function using the Maclaurin series? Is there some rule that says there can only be one polynomial expression of a function? Thanks(6 votes)
- Sal started the whole process by basing the polynomial on the Maclaurin expansion of cos x, so it is safe to say that that **is** the polynomial you would get using the Maclaurin series.(2 votes)

- Can someone write down the proof for this?(6 votes)
- I just don't understand how cos(x^2) has the same Maclaurin series as cos(x) by just replacing each x with x^2.(5 votes)
- Perhaps you should try it and see whether you get the series converging on the correct answer.

for x = ½π you should get

cos x = 0

cos (x²) = −0.781211892....

So, check a few terms of Maclaurin and see whether you are converging on those values.(2 votes)
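The check suggested in the reply is easy to carry out; a minimal sketch in Python, feeding both x and x² into the same cosine series:

```python
import math

def cos_maclaurin(t, n_terms=25):
    # Maclaurin series of cosine: sum of (-1)^n * t^(2n) / (2n)!
    return sum((-1)**n * t**(2*n) / math.factorial(2*n)
               for n in range(n_terms))

x = math.pi / 2
print(cos_maclaurin(x))      # close to cos(pi/2) = 0
print(cos_maclaurin(x**2))   # close to cos(pi^2/4), about -0.7812
```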

- " f''(x)=painful " I love that, does anyone else love that Sal wrote that? He even did that thing where he repeats what he's saying while he writes it for emphasis.(4 votes)

## Video transcript

- [Voiceover] Let's see if we can find the Maclaurin series
representation of f of x, where f of x is equal to x to the third times cosine of x squared. I encourage you to pause the
video and now try to do it. Remember, the Maclaurin series is just the Taylor series centered at zero. Let's say our goal here is
the first five non-zero terms of the Maclaurin series representation, or Maclaurin series approximation of this. I'm assuming you had paused the video and you had attempted to do this. There's a good chance
that you might have gotten quite frustrated when you did this, because in order to find a
Taylor series, Maclaurin series, we need to find the
derivatives of this function, and as soon as you start to do that, it starts to get painful. F prime of x is going to be, let's see, product rule, so it's three x squared
times cosine of x squared plus x to the third times the derivative of this thing, which is going to be two x
times negative sine of x, negative sine of x squared. Just that would be pretty painful. But it's only going to
get more and more painful as we take the second and
third and fourth derivative. We might have to take more, because some of these terms
might end up being zeros. We want the first five non-zero terms. The second derivative, this
is going to be painful. This is going to be painful. Then the third and fourth derivatives are going to be even more painful. So what do we do here? You could just do this. Let's evaluate them each at zero, and then use those for our coefficients, but you're probably guessing correctly that there's an easier way to do this. I will give you a hint. We know what the Maclaurin
series for cosine of x is. We've done that in the previous video. If you want to see that
again, there's another video. Look up "cosine Taylor series
at zero" on Khan Academy, and you'll find that. But we already know from that, and this is one of the most
famous Maclaurin series, we know that this is a g of x. We say g of x is equal to cosine of x. Well, we know what this is like. The Maclaurin
series approximation of that is going to be one minus x squared over two factorial, plus x to the fourth over four factorial, minus x to the sixth over six factorial, plus, and I could keep
going on and on and on. You kind of see where this is going. Plus x to the eighth over eight factorial, and it just keeps going, minus, plus, on and on and on and on. But we wanted first five
terms, so this is a start. I know we wanted the first
five terms of this thing, but bear with me. We'll see how this thing right over here is going to be useful. Now that I've given you a
little bit of a reminder on the Maclaurin series
representation of cosine of x, my hint to you is, can we use this to find the Maclaurin series
representation of this, right up here? Remember, all this is, this
is x to the third times ... And I'm just rereading it for you, this is x to the third
times g of x squared. That's a sizable hint. I encourage you to pause the video again and see if you can work through this. Let me rewrite. I'm assuming you had a go at it. Let me rewrite what I just wrote. I just told you that g of x ... or f of x. F of x ... If I wanted to write it as, I guess you could say as a function, or if I want to construct it using g of x, I could rewrite it as x
to the third times ... instead of cosine of x squared, that's the same thing as g of x squared. So x to the third times g, g of x squared. G of x squared. G of x is just cosine. G of x squared is going
to be cosine of x squared. Then you're going to multiply
that times x to the third. Well, can't we just apply this
insight to the approximation? You might be asking that question. My answer to you is,
"Yes, you absolutely can." Notice, when you substitute
your xes for x squared, you'll just get another polynomial. And if you multiply
that by x to the third, you're just going to
get another polynomial. That actually will be the
Maclaurin series representation of what we started off with. We actually will get the
Maclaurin series representation of this thing right over here. So we could say that f of x is approximately equal to x to the third times x to the third times ... Let me get myself some
space right over here. G of x squared. Over here, this is
approximation for g of x, and if we kept going on and on forever, it would be a representation of g of x. So every place where we see an x, let's replace it with an x squared. This is going to be one minus ... So x squared squared is x to the fourth, x to the fourth over two factorial, which is really just two, but I like to put the factorial there because you see the pattern. Plus, if our x is now x squared, x squared to the fourth
power is x to the eighth, x to the eighth power over four factorial, minus x squared to the sixth power is x to the 12th over six factorial, and then plus x squared to the eighth is x to the 16th power
over eight factorial, and of course, we can keep
going on and on and on. Minus, plus. But we just care
about the first five terms, the non-zero terms, and we're just saying this is an approximation, anyway. So we can say that this is going to be approximately equal to, we distribute the x to the third, and I'm going to do it
in magenta just for fun, so distribute the x to the third, we get x to the third, minus x to the seventh over two factorial, plus x to the 11th over four factorial, minus x to the 15th over six factorial, plus x to the 19th ... x to the 19th over eight factorial. That's the first five non-zero terms, and we are done. When you actually see what we got, you realize it would have
taken you forever to do this by, I guess you could say, by brute force, because you would have had
to find the 19th derivative of all of this madness. But when you realize that, "Hey, gee, "if I can just re-express this function "as, essentially, "x to a power times "something that I know
the Maclaurin series of, "especially ..." Well, if you view it this way: if I can write, if I can rewrite ... Let me say it in a less confusing way. If I can rewrite my function so it is equal to some ... Actually, I can even
throw a coefficient here. If it's equal to some A x to the n power, times some function ... Let me just use a new color. Times, purple. Times g of B, B x to some other power, where I can fairly easily, without too much computation, maybe I already know, if I know the Maclaurin series
representation of g of x, if I know what g of x is going to be, then I can do exactly what
I did just in this video. I find the Maclaurin series
representation for g, every place where I saw an x, I replaced it with what I have over here, the B x to the m power,
where m is some exponent. That will give me another polynomial, another power series, and then I multiply it times A x to the n, and that's going to, once again, give me another power series, and that will be the power
series for my original function. Very exciting.
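The five non-zero terms derived in the transcript can be checked against x³cos(x²) directly; a minimal numeric sketch in Python:

```python
import math

def f(x):
    return x**3 * math.cos(x**2)

def p5(x):
    # x^3 - x^7/2! + x^11/4! - x^15/6! + x^19/8!
    return (x**3
            - x**7 / math.factorial(2)
            + x**11 / math.factorial(4)
            - x**15 / math.factorial(6)
            + x**19 / math.factorial(8))

for x in (0.1, 0.3, 0.5):
    # near zero the truncation error (first dropped term, x^23/10!) is tiny
    assert abs(f(x) - p5(x)) < 1e-9
print("five-term approximation matches near x = 0")
```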