
Visualizing Taylor series approximations

The larger the degree of a Taylor polynomial, the better it approximates the function. See that in action with sin(x) and its Taylor polynomials. Created by Sal Khan.
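The video below walks through this experiment using WolframAlpha's plots. As a minimal sketch of the same idea (assuming NumPy and Matplotlib, which are illustrative choices and not what the video uses), you can plot sin(x) against the partial sums of its Maclaurin series, x − x³/3! + x⁵/5! − x⁷/7! + x⁹/9!, and watch the higher-degree polynomials hug the curve further out:

```python
# A minimal sketch (assuming NumPy and Matplotlib are available; the video itself
# uses WolframAlpha) of the experiment: sin(x) plotted against its Maclaurin
# polynomials of increasing degree.
from math import factorial

import matplotlib.pyplot as plt
import numpy as np


def maclaurin_sin(x, degree):
    """Evaluate the Maclaurin polynomial of sin(x) through the given odd degree."""
    total = np.zeros_like(x)
    for k in range((degree + 1) // 2):   # k-th term is (-1)^k * x^(2k+1) / (2k+1)!
        n = 2 * k + 1
        total += (-1) ** k * x ** n / factorial(n)
    return total


x = np.linspace(-10, 10, 1000)
plt.plot(x, np.sin(x), linewidth=2, label="sin(x)")
for degree in (1, 3, 5, 7, 9):           # the same degrees shown in the video
    plt.plot(x, maclaurin_sin(x, degree), label=f"degree {degree}")
plt.ylim(-3, 3)
plt.legend()
plt.show()
```

The degree-1 curve is the straight line p(x) = x; each higher-degree curve stays close to sin(x) over a wider interval before veering away, which is exactly what the video points out on the WolframAlpha plot.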

Want to join the conversation?

  • Conor McKenzie
    So we can approximate any function using Taylor expansion and derive its polynomial representation.

    Is it possible to approximate any function using a formula of sine curves?
    (40 votes)
  • cjddowd
    I think I get how the Taylor/Maclaurin series works, so can somebody tell me if my thinking is correct:
    The infinite series works because we build into the polynomial an infinite chain of derivatives of the original function. It gives the value of the function at a certain point, and also the rate the function is changing at that point (1st der.), and also the way that change is changing (2nd der.), and the way that the change that changes the rate of change is changing (3rd der.), ad infinitum, so that we essentially have a "seed" that gives us the entire function just from that point. Am I thinking along the right lines?
    (39 votes)
  • Kevin Burri
    I found something quite interesting: if you take the Maclaurin series of sin(x) with a finite polynomial, then no matter how small the coefficients are, for a very big x the biggest power will overcome the others, and p(x) becomes very big and goes really far away from sin(x). However, if we take only the constant (zeroth-order) approximation, we get p(x) = 0, which is never further than 1 from sin(x) for any x. Isn't that counter-intuitive?
    (7 votes)
    • tyersome
      Maybe we can think of this as the cost of being more precise in the center of the function.

      For example, imagine you are trying to get a wire to fit along ripples on a beach. Each time you put a bend in the wire (add a term to the polynomial) you cause the end of the wire to move further away from the surface of the sand ...

      I'll stop this analogy now, before everyone gets too bent out of shape ...
      (6 votes)
  • Hank
    Why is it that by using the values of the function and its derivatives at zero, we can obtain an approximation of the function at values other than zero? Are the other values of the function somehow "encoded" in the behavior of the curve at x = 0? I can see how that would be the case for simple functions like lines and parabolas, but does that continue to hold for more complex functions? For arbitrary functions?

    In a piecewise function, I can see that this method will fail (I imagine a function that is equal to sin(x) at zero but is something entirely different at other values of x), so there are clearly restrictions on the types of functions for which Taylor/Maclaurin series can be taken. Is it simply a requirement that the function be continuous and differentiable?
    (6 votes)
    • Just Keith
      For this to work, the function must be continuous and you must be able to differentiate it infinitely many times.

      There is nothing special about x = 0; it is just easier to calculate. You can use any value of x in the domain as the center of a Taylor series (there's a short sketch of this idea just after the comments below).
      (4 votes)
  • Arez Arazu
    How come I can't find any videos about Taylor series and Taylor polynomials involving two variables, f(x, y)? I'm sure this has been covered, and some help finding these videos would be highly appreciated. If that is not the case, maybe requesting this is not such a bad idea. Thank you.
    (8 votes)
    • David Nikdel
      Look for information on applying "Taylor's theorem" to "multivariate functions". I don't know if Sal has done any videos there, but it would involve partial derivatives and gradients to approximate the surface. I'm not familiar with this, and it's not immediately clear to me how you would approximate a function of N variables using polynomials (or some other N-dimensional building block -- what?), but those are some search terms for you. Hope it helps.
      (5 votes)
  • Obadina Dewee Adewale
    Do I assume that since I know the Maclaurin series, I now know the Taylor series? Do I just take my derivatives at any point and arrive at the Taylor approximation?
    (4 votes)
    • Matthew Hochberg
      No, you only know the Taylor series at one specific point (which is what the Maclaurin series is: the Taylor series centered at x = 0). Each succeeding polynomial in the series hugs more and more of the function, with the chosen center being the one point that every polynomial in the series matches exactly (in the video above, that point is x = 0). The Taylor series generalizes this to any point in the function's domain: you can think of it as a Maclaurin-style expansion that can be centered wherever you like, somewhat like having a general derivative of a function that you can use to find the derivative at any specific point you want. Check out "Generalized Taylor Series Approximation" for a better explanation.
      (3 votes)
  • glen villanueva
    How would you approximate piecewise functions then? Like f(x) = sin(x) when x > 0 and cos(x) when x < 0?
    (3 votes)
  • Danny Taehyun Kim
    Would this chapter (sequences, series, and function approximation, specifically Taylor series) be helpful or give an intuition for learning the Fourier series & transform, which is a subject in engineering mathematics courses?
    (3 votes)
  • Leon Overweel
    Is there a simple way of determining at which points the approximation to the nth degree is exactly equal to the given function? Like, at n=1, the sin(x) approximation is equal to sin(x) only at x=0, so that's once. Is there a formula that tells you how many points are exactly equal given just the nth degree you're approximating to?
    (2 votes)
  • Gavriel Feria
    What's the O(x^10) at the end of the series expansion?

    -TX520
    (2 votes)
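A couple of the comments above touch on the same construction: the polynomial is assembled from the function's derivatives at a chosen center, and there is nothing special about centering at 0. Here is a short sketch of that idea (using SymPy, which is an assumption of this example and not something the video uses):

```python
# A short sketch of the construction discussed in the comments above: the Taylor
# polynomial is built from the function's derivatives at a chosen center, and
# center 0 is just the special (Maclaurin) case from the video. SymPy is an
# assumption of this example, not something the video uses.
import sympy as sp

x = sp.symbols("x")


def taylor_polynomial(f, center, degree):
    """Sum of f^(n)(center) / n! * (x - center)^n for n = 0 .. degree."""
    return sum(
        sp.diff(f, x, n).subs(x, center) / sp.factorial(n) * (x - center) ** n
        for n in range(degree + 1)
    )


# Centered at 0 (Maclaurin): x - x**3/6 + x**5/120
print(taylor_polynomial(sp.sin(x), 0, 5))

# Centered at pi/2 instead: 1 - (x - pi/2)**2/2 + (x - pi/2)**4/24
print(taylor_polynomial(sp.sin(x), sp.pi / 2, 5))
```

The second polynomial shares no terms with the expansion at 0, yet it fits sin(x) well near x = π/2; only the center changed, not the recipe.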

Video transcript

I've talked a lot about using polynomials to approximate functions, but what I want to do in this video is actually show you that the approximation is actually happening. So right over here-- and I'm using WolframAlpha for this. It's a very cool website. You can do all sorts of crazy mathematical things on it. So WolframAlpha.com-- I got this copied and pasted from them. I met Stephen Wolfram at a conference not too long ago. He said yes, you should definitely use WolframAlpha in your videos. And I said, great. I will. And so that's what I'm doing right here.

And it's super useful, because what it does is-- and we could have calculated a lot of this on our own, or even done it on a graphing calculator. But you usually can do it just with one step on WolframAlpha-- is see how well we can approximate sine of x using-- you could call it a Maclaurin series expansion, or you could call it a Taylor series expansion-- at x is equal to 0 using more and more terms. And having a good feel for the fact that the more terms we add, the better it hugs the sine curve.

So this over here in orange is sine of x. That should hopefully look fairly familiar to you. And in previous videos, we figured out what that Maclaurin expansion for sine of x is. And WolframAlpha does it for us as well. They actually calculate the factorials. 3 factorial is 6, 5 factorial is 120, so on and so forth. But what's interesting here is you can pick how many of the approximations you want to graph.

And so what they did is if you want just one term of the approximation-- so if we didn't have this whole thing. If we just said that our polynomial is equal to x, what does that look like? Well, that's going to be this graph right over here. They tell us which term-- how many terms we used by how many dots there are right over here, which I think is pretty clever. So this right here, that is the function p of x is equal to x. And so it's a very rough approximation, although for sine of x, it doesn't do a bad job. It hugs the sine curve right over there. And then it starts to veer away from the sine curve again.

You add another term. So if you have the x minus x to the third over 6. So now you have two terms in the expansion. Or I guess we should say we were up to the third-order term, because that's how they're numbering the dots. Because they're not talking about the number of terms. They're talking about the order of the terms. So they have one dot here, because we have only one first-degree term. When we have two terms here, since we-- when you do the expansion for sine of x, it doesn't have a second-degree term. We now have a third-degree polynomial approximation. And so let's look at the third degree. We should look for three dots. That's this curve right over here.

So if you just have that first term, you just get that straight line. You add the negative x to the third over 6 to that x. You now get a curve that looks like this. And notice it starts hugging sine a little bit earlier. And it keeps hugging it a little bit later. So once again, just adding that second term does a pretty good job. It hugs the sine curve pretty well, especially around smaller numbers.

You add another term. And now we're at an order five polynomial, right over here. So x minus x to the third over 6 plus x to the fifth over 120. So let's look for the five dots. So that's this one right over here-- one, two, three, four, five. So that's this curve right over here.
And notice it begins hugging the line a little bit earlier than the magenta version, and it keeps hugging it a little bit longer. Then it flips back up like this. So it hugged it a little bit longer. And you can see I'll keep going. If you have all these first four terms, it gives us a seventh degree polynomial. So let's look for the seven dots over here. So they come in just like this. And then once again, it hugs the curve sooner than when we just had the first three terms. And it keeps hugging the curve all the way until here. And then the last one. If you have all of these terms up to x to the ninth, it does it even more. You start here. It hugs the curve longer than the others. And goes out.

And if you think about it, it makes sense, because what's happening here is each successive term that we're adding to the expansion, they have a higher degree of x over a much, much, much, much larger number. So for small x value-- so when you're close to the origin for small x values, this denominator is going to overpower the numerator, especially when you're below 1. Because when you take something that has absolute value less than 1 to a power, you're actually shrinking it down. So when you're close to the origin, these latter terms don't matter much. So you're kind of not losing some of the precision of some of the earlier terms. When these tweaking terms come in, these come in when the numerator can start to overpower the denominator. So this last term, it starts to become relevant out here, where all of a sudden x to the ninth can overpower 362,880. And the same thing on the negative side.

So hopefully this gives you a sense. We only have one, two, three, four, five terms here. Imagine what would happen if we had an infinite number of terms. I think you get a pretty good sense that it would kind of hug the sine curve out to infinity. So hopefully that makes you feel a little bit better about this. And for fun, you might want to go type in-- you can type in Taylor expansion at 0 and sine of x, or Maclaurin expansion or Maclaurin series for sine of x, cosine of x, e to the x, at WolframAlpha.com. And try it out for a bunch of different functions. And you can keep adding or taking away terms to see how well it hugs the curve.
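To see that last point with numbers, here is a tiny plain-Python sketch (an illustrative addition, not part of the video) that prints the individual Maclaurin terms of sin(x) for a small input and a large one:

```python
# A tiny numeric illustration (plain Python, not part of the video) of why the
# higher-order terms only matter far from the origin: for small x the factorial
# denominators crush x**n, while for big x the numerator eventually wins.
from math import factorial


def sin_terms(x, max_power=9):
    """Individual Maclaurin terms (-1)^k * x^n / n! of sin(x) for n = 1, 3, ..., max_power."""
    return [(-1) ** k * x ** n / factorial(n)
            for k, n in enumerate(range(1, max_power + 1, 2))]


print(sin_terms(0.5))   # terms shrink fast: 0.5, -0.0208..., 0.00026..., ...
print(sin_terms(8.0))   # the x**9 / 362880 term is about 370 and clearly matters
```

At x = 0.5 the series has essentially converged after two terms, while at x = 8 the x⁹/362,880 term is still contributing hundreds, which is exactly where the low-degree polynomials in the plot fly away from the sine curve.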