What does it mean to take the derivative of a function whose input lives in multiple dimensions? What about when its output is a vector? Here we go over many different ways to extend the idea of a derivative to higher dimensions, including partial derivatives, directional derivatives, the gradient, vector derivatives, divergence, and curl.
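As a concrete starting point, a partial derivative nudges just one input while holding the others fixed. Here is a minimal numerical sketch (the function and names are my own, not from the text), using central finite differences on f(x, y) = x²y:

```python
def f(x, y):
    return x**2 * y

def partial_x(f, x, y, h=1e-5):
    # ∂f/∂x: nudge only the x input, hold y fixed
    return (f(x + h, y) - f(x - h, y)) / (2 * h)

def partial_y(f, x, y, h=1e-5):
    # ∂f/∂y: nudge only the y input, hold x fixed
    return (f(x, y + h) - f(x, y - h)) / (2 * h)

# Exact answers: ∂f/∂x = 2xy and ∂f/∂y = x², so at (3, 2)
# we expect roughly 12.0 and 9.0.
print(partial_x(f, 3.0, 2.0))
print(partial_y(f, 3.0, 2.0))
```

The key idea the code makes visible: each partial derivative is just an ordinary single-variable derivative taken along one coordinate direction.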
The gradient is like the king of all partial derivatives. Or perhaps it's more like the country they all live in. It stores all partial derivative information in a single vector-valued function, and as such it is a central tool for analyzing rates of change in multivariable functions.
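To see how the gradient "stores all partial derivative information," here is a small sketch (function and names are my own) that collects every numerically estimated partial derivative into one vector:

```python
def gradient(f, point, h=1e-5):
    grad = []
    for i in range(len(point)):
        up = list(point)
        down = list(point)
        up[i] += h
        down[i] -= h
        # The i-th entry is the partial derivative with
        # respect to the i-th input coordinate.
        grad.append((f(up) - f(down)) / (2 * h))
    return grad

def f(p):
    x, y = p
    return x**2 + 3 * y

# Exact gradient is (2x, 3); at (1, 2) we expect about (2.0, 3.0).
print(gradient(f, [1.0, 2.0]))
```

The same loop works in any number of dimensions, which is exactly why the gradient generalizes so cleanly.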
If you have one function that takes a single number into a high-dimensional space, and another which maps that high-dimensional space back down to the number line, applying one after the other gives a regular old single-variable function. But how do you find the derivative of this new function?
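The answer is the multivariable chain rule: the derivative of the composition is the dot product of the gradient with the curve's velocity vector. A hedged sketch (the curve r(t) = (cos t, sin t) and field f(x, y) = xy are my own illustrative choices), checked against a direct numerical derivative:

```python
import math

def r(t):
    # Curve: number line -> plane
    return (math.cos(t), math.sin(t))

def f(p):
    # Scalar field: plane -> number line
    x, y = p
    return x * y

def g(t):
    # Composition: an ordinary single-variable function
    return f(r(t))

def chain_rule_derivative(t):
    # Multivariable chain rule: g'(t) = ∇f(r(t)) · r'(t)
    x, y = r(t)
    grad = (y, x)                          # ∇f for f(x, y) = x*y
    r_prime = (-math.sin(t), math.cos(t))  # velocity of the curve
    return grad[0] * r_prime[0] + grad[1] * r_prime[1]

t, h = 0.7, 1e-6
numeric = (g(t + h) - g(t - h)) / (2 * h)
print(chain_rule_derivative(t), numeric)  # the two should agree closely
```

Here g(t) = cos t · sin t, so the exact derivative is cos(2t), and both computations land on that value.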
In the fluid flow interpretation of vector fields, divergence is a measure of how much fluid tends to flow away from each point. However, this turns out to have far-reaching consequences beyond the specific case of fluid flow.
Divergence and curl are two ways of extending the idea of a derivative to vector fields.
If you interpret a vector field as representing a fluid flow, the divergence tells you if the fluid tends to converge near a given point, or if it diverges away from it. The curl, on the other hand, measures rotation in the fluid.
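The rotation intuition for curl can be checked the same way. In 2D the curl reduces to the scalar ∂F_y/∂x − ∂F_x/∂y; in this sketch (field and names are my own), F(x, y) = (−y, x) spins fluid counterclockwise around the origin:

```python
def curl_2d(F, x, y, h=1e-5):
    # 2D (scalar) curl: ∂F_y/∂x - ∂F_x/∂y, via central differences
    dFy_dx = (F(x + h, y)[1] - F(x - h, y)[1]) / (2 * h)
    dFx_dy = (F(x, y + h)[0] - F(x, y - h)[0]) / (2 * h)
    return dFy_dx - dFx_dy

def rotation(x, y):
    # Fluid spinning counterclockwise around the origin
    return (-y, x)

def outflow(x, y):
    # Fluid flowing radially away from the origin
    return (x, y)

print(curl_2d(rotation, 1.0, 2.0))  # constant 2: pure rotation
print(curl_2d(outflow, 1.0, 2.0))   # 0: radial outflow has no rotation
```

The contrast between the two fields is the whole point: the rotational field has curl but no divergence, while the radial field has divergence but no curl.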