
Video transcript

When you have a multivariable function, something that takes in multiple different input values, and let's say it's just outputting a single number, a very common thing you want to do with a function like this is maximize it. What that means is you're looking for the input points, the values of x and y and all of its other inputs, such that the output of f is as great as it possibly can be. Now, this actually comes up all the time in practice, because usually when you're dealing with a multivariable function, it's not just for fun or for dealing with abstract symbols; it's because it actually represents something. Maybe it represents the profits of a company. Maybe this is a function where you're considering all the choices you can make, like the wages you give your employees, or the prices of your goods, or the amount of debt that you raise for capital, all sorts of choices that you might make, and you want to know what values you should give to those choices such that you maximize profits. If you have a function that models these relationships, there are techniques, which I'm about to teach you, that you can use to maximize it.

Another very common setting, more and more important these days, is that of machine learning and artificial intelligence, where often what you do is assign something called a cost function to a task. Maybe you're trying to teach a computer how to understand audio, or how to read handwritten text. What you do is find a function that basically tells it how wrong it is when it makes a guess, and if you do a good job designing that function, you just need to tell the computer to minimize it. So that's kind of the flip side: instead of finding the maximum, you minimize a certain function. If it minimizes this cost function, that means it's doing a really good job at whatever task you've assigned it. So a lot of the art and science of machine learning and artificial intelligence comes down to, for one thing,
finding this cost function, actually describing difficult tasks in terms of a function, but then applying the techniques that I'm about to teach you to have the computer minimize it. A lot of time and research has gone into figuring out ways to apply these techniques really quickly and efficiently.

First of all, on a conceptual level, let's just think about what it means to find the maximum of a multivariable function. I have here the graph of a two-variable function: something that has a two-variable input, which we're thinking of as the xy-plane, and whose output is the height of this graph. If you're looking to maximize it, basically what you're finding is this peak, kind of the tallest mountain in the entire area, and you're looking for the input value, the point on the xy-plane directly below that peak, because that tells you the values of the inputs you should put in to maximize your function.

So how do you go about finding that? Well, this is perhaps the core observation in all of calculus, not just multivariable calculus; it's similar in the single-variable world, and there are similarities in other settings. The core observation is that if you take a tangent plane at that peak, so let's just draw in a tangent plane at that peak, it's going to be completely flat. But let's say you did this at a different point: you moved the tangent plane about a bit, to somewhere that's not quite a maximum. If the tangent plane has any kind of slope to it, what that's telling you is that if you take tiny steps in the direction of that upward slope, you can increase the value of your function. So if there's any slope to the tangent plane, you know that you can walk in some direction to increase it; but if there's no slope to it, if it's flat, that's a sign that no matter which direction you walk, you're not going to be significantly increasing the value of your function.
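That uphill-walking idea is exactly what numerical gradient ascent does. Here is a minimal sketch in Python; the surface, starting point, step size, and iteration count are all my own illustrative choices, not taken from the video:

```python
import math

def f(x, y):
    # A made-up surface with a single peak at (1, -2); the value there is 1.
    return math.exp(-(x - 1)**2 - (y + 2)**2)

def grad_f(x, y, h=1e-6):
    # Numerical partial derivatives via central differences.
    dfdx = (f(x + h, y) - f(x - h, y)) / (2 * h)
    dfdy = (f(x, y + h) - f(x, y - h)) / (2 * h)
    return dfdx, dfdy

# Start somewhere on the graph and repeatedly walk in the direction of the
# tangent plane's upward slope; the steps shrink to nothing where it is flat.
x, y = 0.0, 0.0
step = 0.5
for _ in range(200):
    dfdx, dfdy = grad_f(x, y)
    x += step * dfdx
    y += step * dfdy

print(round(x, 3), round(y, 3))  # 1.0 -2.0, the inputs below the peak
```

In the machine-learning setting mentioned above, the same loop is run downhill instead (gradient descent), with the cost function playing the role of f.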
So what does this mean in terms of formulas? Well, if you think back to how we compute tangent planes (and if you're not very comfortable with that, now would be a good time to take another look at those videos about tangent planes), the slope of the plane in each direction, the slope in the x direction, and then, if you look at it from another perspective, the slope in the y direction, each one of those has to be 0. In terms of partial derivatives, that means the partial derivative of your function, evaluated at whatever point you're dealing with, which I'll call (x₀, y₀), has to be 0; and similarly the partial derivative with respect to the other variable, with respect to y, at that same point, has to be 0. Both of these have to be true. Let's take a look: slide the tangent plane over a little bit here. If you imagine walking in the y direction, you're not increasing your value at all; the slope in the y direction would be 0, so the partial derivative with respect to y would be 0. But with respect to x, when you're moving in the x direction here, the slope is clearly negative, because as you take positive steps in the x direction, the height of your tangent plane is decreasing, which corresponds to the fact that if you take tiny steps on your graph, the height will decrease in a manner proportional to the size of those tiny steps. So what this gives you is a system of equations, where you're solving for the values of x₀ and y₀ that satisfy both equations. In future videos I'll go through specific examples of this; for now I just want to give a good conceptual understanding. But one very important thing to notice is that just because this condition is satisfied, meaning your tangent plane is flat, doesn't necessarily mean that you've found the maximum.
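As a concrete instance of that system of equations, here is a hypothetical two-variable function simple enough that both partial derivatives can be set to zero by hand, with a small numerical check of the result (the function is my own example, not one from the video):

```python
def f(x, y):
    # A made-up profit-style function; its graph is a downward paraboloid.
    return -(x - 1)**2 - (y + 2)**2 + 3

# Setting each partial derivative to zero gives the system:
#   df/dx = -2(x - 1) = 0   =>   x0 = 1
#   df/dy = -2(y + 2) = 0   =>   y0 = -2
x0, y0 = 1.0, -2.0

# Numerical check that both slopes really vanish at (x0, y0):
h = 1e-6
dfdx = (f(x0 + h, y0) - f(x0 - h, y0)) / (2 * h)
dfdy = (f(x0, y0 + h) - f(x0, y0 - h)) / (2 * h)
print(abs(dfdx) < 1e-9, abs(dfdy) < 1e-9)  # True True
```

With more realistic functions the system usually can't be solved by inspection, which is what the worked examples in later videos are for.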
That's just one requirement it has to satisfy. For one thing, if you found the tangent plane at other little peaks, like this guy here, or this guy here, or any of the little bumps that go up, those tangent planes would also be flat. Those little bumps actually have a name, because this comes up a lot: they're called local maxima. Maxima is just the plural of maximum, and local means it's relative to a single point. Basically, if you walk in any direction when you're on one of those little peaks, you'll go downhill, so relative to the neighbors of that point it is a maximum; but relative to the entire function, these guys are, you know, the shorter mountains next to Mount Everest. There's also another circumstance where you might find a flat tangent plane, and that's at the minimum points. Whether you have the global minimum, the absolute smallest output, or just a local minimum, one of these inverted peaks, you'll also find a flat tangent plane. So first of all, that means that when you're minimizing a function, you look for the same requirement, that all the partial derivatives are zero. But mainly it means that your job isn't done once you've satisfied that requirement: you have to do more tests to check whether what you found is a local maximum, a local minimum, or a global maximum. These requirements, by the way, are often written in a more succinct form. Instead of saying that all the partial derivatives have to be 0, which is what you need to find, you phrase it in terms of the gradient of the function f, which of course is just the vector that contains all those partial derivatives: its first component is the partial derivative with respect to the first variable, its second component is the partial derivative with respect to the second variable, and if there are more variables, you keep going. You'd say that this
whole thing has to equal the zero vector, the vector that has nothing but zeros as its components. It's kind of a common abuse of notation: people will just call that zero vector 0, and maybe they'll emphasize it by making it bold, because the number zero is not a vector, and often making things bold emphasizes that you're referring to a vector. But this gives a very succinct way of describing the requirement: you're just looking for where the gradient of your function equals the zero vector, and that way you can write it on one line. In practice, every time you expand that out, what it means is that you find all of the different partial derivatives; so this is really just a matter of notational convenience and using less space on a blackboard. Whenever you see that the gradient equals zero, what you should be thinking of is the idea that the tangent plane is completely flat. And as I just said, that's not enough, because you might also be finding local maxima or minima. But in multivariable calculus there's also another possibility: a place where the tangent plane is flat, but what you're looking at is neither a local maximum nor a local minimum. This is the idea of a saddle point, which is new to multivariable calculus, and it's what I'll be talking about in the next video. So I will see you then.
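A quick numerical sketch of that saddle-point idea, using the classic surface z = x² − y²: the gradient vanishes at the origin, so the tangent plane there is flat, yet sampling nearby points shows it is neither a local maximum nor a local minimum. The helper function below is a crude stand-in for the more systematic tests mentioned above, and all names here are my own:

```python
import math

def saddle(x, y):
    # The classic saddle surface z = x^2 - y^2.
    return x**2 - y**2

def grad_saddle(x, y):
    # Exact partial derivatives: (2x, -2y); both vanish at the origin.
    return (2 * x, -2 * y)

def classify(fn, x0, y0, h=1e-3):
    # Crude check (not the second-derivative test): sample neighbors of
    # (x0, y0) in eight directions and compare their heights to the center.
    center = fn(x0, y0)
    diffs = [fn(x0 + h * math.cos(2 * math.pi * k / 8),
                y0 + h * math.sin(2 * math.pi * k / 8)) - center
             for k in range(8)]
    if all(d < 0 for d in diffs):
        return "local maximum"
    if all(d > 0 for d in diffs):
        return "local minimum"
    return "neither: a saddle point"

print(grad_saddle(0, 0))       # (0, 0): the tangent plane is flat here
print(classify(saddle, 0, 0))  # neither: a saddle point
```

Walking along the x-axis from the origin increases the height, while walking along the y-axis decreases it, which is exactly why the flat tangent plane alone can't tell you which kind of point you've found.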