
Laplacian intuition

A visual understanding of how the Laplace operator is an extension of the second derivative to multivariable functions. Created by Grant Sanderson.


Video transcript

- [Voiceover] So here I'm gonna talk about the Laplacian. And the Laplacian is a certain operator in the same way that the divergence, or the gradient, or the curl, or even just the derivative are operators: things that take in some kind of function and give you another function. So in this case, let's say we have a multivariable function like f that just takes in a two-dimensional input, f of x, y. So you might imagine its graph as being something like this, where the input space is this x-y plane here, so each of the points (x, y) is a point here, and then the output is just given by the height of that graph.

So the Laplacian of f is denoted with a right-side-up triangle, and it's gonna give you a new scalar-valued function of x and y. So it's gonna give you a new function that takes in a two-dimensional input and just outputs a number. And it's kind of like a second derivative, because the way that it's defined is that you take the divergence of the gradient of your function f. So that's kind of how it's defined: the divergence of the gradient of f. And a more common notation that you might see here is to take that upside-down triangle, nabla, dot product, with nabla of f. So remember, if f is a scalar-valued function, then the gradient of f gives you a vector field, a certain vector field. But the divergence of a vector field gives you another scalar-valued function. So this is the sense in which it's a second derivative.

But let's see if we can kind of understand intuitively what this should mean. 'Cause the gradient, if you remember, gives you the direction of steepest ascent. So it's a vector field in the input space of f, and each one of the vectors points in the direction that you should walk, such that if this graph is kind of a hill on top of you, it tells you the direction you should go to increase the value of the function most rapidly. And if that seems unfamiliar, if it doesn't make sense, maybe go take a look at that video on gradients and graphs and how they relate to each other. So with the specific graph that I have pictured here, when you have kind of the top of a hill, for all of the points around it, the direction that you should walk is towards the top of that hill. Whereas when you have kind of like the bottom, a little gully here, all of the directions you should walk to increase the value of the function most rapidly are directly away from that point, which you might call a local minimum.

So let's temporarily get rid of the graph, just so we can look at the gradient field here pretty clearly. And now let's think about what the divergence is supposed to represent. So now the divergence, and again, if this feels unfamiliar, maybe go back and take a look at the divergence videos, but the divergence has you imagining that this vector field corresponds to some kind of fluid flow. So you imagine little, like, water molecules, and at any given moment, they're moving along the vector that they're attached to. So for example, if you had a water molecule that started off kind of here, you would start by going along that vector and then kind of follow the ones near it, and it looks like it kind of ends up in this spot, and a lot of the water molecules seem to kind of converge over there. Whereas over here, the water molecules tend to go away when they're following those vectors away from this point. And when they go away like that, when you have a whole bunch of vectors kind of pointed away, that's an indication that the divergence is positive, because they're diverging away.
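(As an aside from the transcript: the definition just described can be written out symbolically. The video only names the Laplacian as "the divergence of the gradient"; the expansion into second partial derivatives shown here is the standard two-variable formula that follows from the definitions of gradient and divergence.)

```latex
\nabla^2 f \;=\; \Delta f
\;=\; \nabla \cdot (\nabla f)
\;=\; \nabla \cdot \left( \frac{\partial f}{\partial x},\; \frac{\partial f}{\partial y} \right)
\;=\; \frac{\partial^2 f}{\partial x^2} + \frac{\partial^2 f}{\partial y^2}
```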
Coming back to the picture: over here, the divergence is positive, whereas in the opposite case, where all of the water molecules are kind of coming in towards a point, that's where the divergence is negative. And in another area, let's say it was kind of like this center point, where you have some water molecules that look like they're coming in, but other ones are going out, and at least from this picture, it doesn't seem like the ones going out are doing so at a noticeably faster or slower rate than the ones coming in, this would be roughly zero divergence.

So now let's think about what it might mean when you take the divergence of the gradient field of f. So let me kind of clear up the markings I made on top of it here. Points of high divergence, points where it diverges a lot here, why are those vectors pointing away? If we pull up the graph again, the reason they're pointing away is 'cause the direction of steepest ascent is kind of uphill everywhere; you are in a valley. Whereas in the opposite circumstance, where the divergence is highly negative, 'cause points are converging towards it, why are they pointing towards it? Well, this is a gradient field, so they're pointing towards that spot because, anywhere around it, you should walk towards that spot to go uphill.

So in other words, the divergence of the gradient is very high at points that are kind of like minima, at points where all the points around them tend to be higher. But the divergence of the gradient is low at points that look more like maximum points, where when you evaluate the function at all of the points around that input point, they give something smaller. So this Laplacian operator is kind of a measure of how much of a minimum the point (x, y) is. It will be very positive when f evaluated at that point tends to give a smaller value than f evaluated at the neighbors of that point, but it'll be very negative if, when you evaluate f at that point, it tends to be bigger than its neighbors.

And this should feel kind of analogous to the second derivative in ordinary calculus: when you have some kind of graph of just a single-variable function, the second derivative will be low, it'll be negative, at points that kind of look like a local maximum, but over here, the second derivative would be positive at points that kind of look like a local minimum. So in that way, the Laplacian is sort of an analog of the second derivative for scalar-valued multivariable functions. And in the next video, I'll go through an example of that.
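(Another aside, not from the video: the "how much of a minimum" intuition can be checked numerically. The sketch below is a minimal illustration; the sample functions and step size are assumptions chosen for the example. It estimates the Laplacian with a finite-difference stencil, which in effect compares f at a point to the values of f at its neighbors.)

```python
# Finite-difference estimate of the Laplacian of f(x, y):
#   laplacian f ≈ (f(x+h,y) + f(x-h,y) + f(x,y+h) + f(x,y-h) - 4*f(x,y)) / h^2
# The estimate is positive where the point sits below its neighbors (a minimum)
# and negative where it sits above them (a maximum).

def laplacian(f, x, y, h=1e-3):
    return (f(x + h, y) + f(x - h, y)
            + f(x, y + h) + f(x, y - h)
            - 4 * f(x, y)) / h**2

bowl = lambda x, y: x**2 + y**2      # local minimum at the origin
dome = lambda x, y: -(x**2 + y**2)   # local maximum at the origin

print(laplacian(bowl, 0.0, 0.0))  # ≈ 4  (> 0: neighbors are higher)
print(laplacian(dome, 0.0, 0.0))  # ≈ -4 (< 0: neighbors are lower)
```

For the bowl, the exact Laplacian is 2 + 2 = 4 everywhere, matching the estimate; the sign flips for the dome, mirroring the second-derivative picture from the end of the video.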