- [Voiceover] So in the last couple videos, I talked about partial derivatives of multivariable functions. Here, I want to talk about second partial derivatives. So I'm going to write down some multivariable function; let's say it's, well, sine of x times y squared: f(x, y) = sin(x)·y². And if you take the partial derivative, you have two options, given that there are two variables. You can go one way and ask: what's the partial derivative of f with respect to x? For that, x looks like a variable as far as this direction is concerned, and y looks like a constant. So we differentiate by saying the derivative of sine of x is cosine of x, since we're differentiating with respect to x, and then y squared looks like a multiplicative constant, so it just comes along for the ride: ∂f/∂x = cos(x)·y². But you could also go the other direction and ask: what's the partial derivative with respect to y? In that case, you're considering y to be the variable. So y squared looks like a variable, x looks like a constant, and sine of x just looks like sine of a constant, which is a constant. So the result is that constant, sine of x, multiplied by the derivative of y squared, which is two times y: ∂f/∂y = sin(x)·2y. These are what you might call first partial derivatives. And there's some alternate notation here: instead of ∂f/∂y, you could write f with a little subscript y, and over here, similarly, f with a little subscript x.
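If you want to check these two first partials symbolically, here's a minimal sketch using Python's sympy library (assuming sympy is installed):

```python
import sympy as sp

# f(x, y) = sin(x) * y^2, the function from this video
x, y = sp.symbols('x y')
f = sp.sin(x) * y**2

# Partial with respect to x: y^2 rides along as a constant
print(sp.diff(f, x))  # y**2*cos(x)

# Partial with respect to y: sin(x) rides along as a constant
print(sp.diff(f, y))  # 2*y*sin(x)
```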
Now, each of these two partial derivatives is itself a multivariable function: it takes in two variables and outputs a number. So we can do something very similar and apply the partial derivative with respect to x to that partial derivative of the original function with respect to x. It's just like a second derivative in ordinary calculus, but this time it's partial. When you differentiate with respect to x, cosine of x looks like cosine of a variable, whose derivative is negative sine of that variable, and y squared just looks like a constant, so it stays as y squared. The result is -sin(x)·y².
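As a quick sympy check of that second derivative (same setup as the sketch above):

```python
import sympy as sp

x, y = sp.symbols('x y')
f = sp.sin(x) * y**2

# Differentiating by x twice: cos(x)*y**2 becomes -sin(x)*y**2
print(sp.diff(f, x, x))  # -y**2*sin(x)
```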
Similarly, you could go down a different branch of options here and ask: what if you took the partial derivative with respect to y of that whole function, which is itself the partial derivative with respect to x? If you do that, y squared now looks like the variable, so you take its derivative, which is two y, and what's in front of it, cosine of x, just looks like a constant as far as the variable y is concerned. So that stays as cosine of x, giving cos(x)·2y.
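And the mixed derivative, differentiating by x first and then by y, checks out the same way in sympy:

```python
import sympy as sp

x, y = sp.symbols('x y')
f = sp.sin(x) * y**2

# First the x derivative, then the y derivative of cos(x)*y**2
print(sp.diff(f, x, y))  # 2*y*cos(x)
```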
Now, the notation. First of all, just as in single-variable calculus, it's common to abuse notation a bit with this kind of thing and write ∂²f/∂x², partial squared f divided by partial x squared. When I first learned about these things, this always threw me off, because with Leibniz notation you have the great intuition of nudging the x and watching the resulting nudge in f, and you kind of lose that when you write it this way. But it makes sense if you think of ∂/∂x as an operator that you're just applying twice. Over here, for the mixed one, it looks a little funny: you still have that ∂²f on top, but on the bottom you write ∂y ∂x. I'm putting them in this order because it's as if I wrote the ∂/∂y operator to the left of ∂f/∂x: it reflects the fact that first I took the x derivative, and then I took the y derivative. And you could do the same thing on the other side.
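That operator picture is easy to see in a sympy sketch: applying .diff one variable at a time gives exactly the same expression as asking for the mixed derivative in a single call (again assuming sympy):

```python
import sympy as sp

x, y = sp.symbols('x y')
f = sp.sin(x) * y**2

# Apply the d/dx operator, then the d/dy operator, one step at a time
step_by_step = f.diff(x).diff(y)

# Ask for the mixed partial in a single call
all_at_once = sp.diff(f, x, y)

print(step_by_step == all_at_once)  # True: both are 2*y*cos(x)
```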
Computing these by hand might feel tedious, but it's actually worth doing for a result that I find a little bit surprising. So here, if we go down the other path and take the partial derivative with respect to x, thinking of it as applied to the original partial derivative with respect to y, then sine of x looks like the variable and two y looks like a constant. What we end up getting is the derivative of sine of x, cosine of x, multiplied by that two y: cos(x)·2y. And here's a pretty cool thing worth pointing out, which maybe you take for granted, or maybe you find it as surprising as I did when I first saw it: both of these turn out to be equal, even though we got there in very different ways. Taking the partial derivative with respect to x first gives cos(x)·y², which looks very different from sin(x)·2y, and yet when you then differentiate each one with respect to the other variable, you get the same value either way. The way you might write this is ∂²f/∂x∂y = ∂²f/∂y∂x: doing the y derivative and then the x derivative gives the same thing as doing x and then y. That's a pretty cool result. And maybe in this case, given that the original function is just a product of two things, you can reason through why it's the case.
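Here's that equality as a minimal sympy check; both orders of differentiation land on the same expression:

```python
import sympy as sp

x, y = sp.symbols('x y')
f = sp.sin(x) * y**2

print(sp.diff(f, x, y))  # 2*y*cos(x), doing x first and then y
print(sp.diff(f, y, x))  # 2*y*cos(x), doing y first and then x
```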
But what's surprising is that this turns out to be true much more generally. Not for all functions: there's a certain criterion, captured in a theorem called Schwarz's theorem, which says that if the second partial derivatives of your function are continuous at the relevant point, then this equality holds there. And for all intents and purposes, for the kinds of functions you can expect to run into, that's the case: the order of the partial derivatives doesn't matter, and the symmetry holds. Which is actually pretty cool.
I'd encourage you to play around with some other functions. Come up with any multivariable function, maybe a little more complicated than just a product of two separate pieces, and see that it's true. And maybe try to convince yourself why it's true in certain cases; I think that would actually be a really good exercise.
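For instance, here's one such experiment as a sympy sketch; the function g below is just an arbitrary example I'm making up, not one from the video:

```python
import sympy as sp

x, y = sp.symbols('x y')

# An arbitrary, somewhat messier multivariable function
g = sp.exp(x * y) + sp.cos(x + y**2)

g_xy = sp.diff(g, x, y)  # x first, then y
g_yx = sp.diff(g, y, x)  # y first, then x

# The difference simplifies to 0, as Schwarz's theorem predicts
print(sp.simplify(g_xy - g_yx))  # 0
```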
And just before I go, I should probably mention one more bit of notation that people commonly use. For the second partial derivative, instead of writing ∂²f/∂x², they'll sometimes just write f with the subscript xx. And for the mixed one here, where you first differentiated with respect to x and then y, it would be f with the subscript xy. Notice the order of the letters reverses relative to ∂²f/∂y∂x: with subscripts you read left to right, x and then y, but in the Leibniz form you read right to left for the order in which the operators apply. Which means the other one, where you first did y and then x, would be f with the subscript yx. In each case, the subscript form and the Leibniz form are just different notations for the same thing.
That can make things a little more convenient when you don't want to write out the entire ∂²f/∂x² or the like. And with that, I'll call it an end.