
Video transcript

- [Instructor] We know that when we're just multiplying regular numbers we have the notion of a reciprocal. For example, if I were to take two and multiply it by its reciprocal, one half, it would be equal to one. Or if I were to take a, where a is not equal to zero, and multiply it by its reciprocal, then for any a that is not equal to zero this will also be equal to one. And one is the number that, if I multiply it times anything, just gives me that original number back. So that's interesting; put that in the back of your mind, you learned this many, many years ago. Now we also have something that comes out of our knowledge of functions. We know that if there's some function, let's call it f(x), that goes from some set we call our domain to some other set we call our range, then in many cases, not all cases, there's another function that can take us back. So this is the function f that goes from x to f(x), and in many cases, but not always, there's another function, which we call the inverse of f, such that if you apply the inverse of f to f(x) you're going to get back to where you were. You're going to get back to x. And we also know that it goes the other way around. For example, if you take f of f inverse of x, that too will get us back to x. So the natural question is: is there an analog for an inverse of a function, or for a reciprocal when we're multiplying, when we think about matrices? So let's play with a few ideas. Let's imagine a matrix as a transformation, which we have already talked about. When we think about matrices as transformations they really are functions. They are functions that take one point in a certain dimensional space, let's say in the coordinate plane, to another point; they transform a vector to another vector. For example, let's imagine something that does a clockwise 90 degree rotation. And we know how to construct that transformation matrix, which really is a function.
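The two round trips described here, a number times its reciprocal and a function composed with its inverse, can be sketched in a few lines of plain Python. The particular function f and its inverse f_inv below are illustrative examples, not ones from the video:

```python
# Reciprocal: a nonzero number times its reciprocal is 1.
a = 2
print(a * (1 / a))  # 1.0

# Inverse functions: applying f and then its inverse (or the
# other way around) gets you back to where you started.
# f and f_inv here are an illustrative pair, not from the video.
def f(x):
    return 2 * x + 1

def f_inv(y):
    return (y - 1) / 2

print(f_inv(f(5)))  # 5.0
print(f(f_inv(5)))  # 5.0
```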
What it does is, in our transformation matrix we ask: what do we do with the one, zero unit vector, and what do we do with the zero, one unit vector, when we do that transformation? Well, if you're doing a 90 degree clockwise turn, then the one, zero unit vector is going to go right over here. It's going to be turned into the zero, negative one vector, so I'll write that right there. And then the zero, one vector is going to be turned into the one, zero vector, so let me write that down. This is 90 degrees clockwise. Then we can think about what 90 degrees counterclockwise would look like. Going counterclockwise, your original one, zero vector is going to go over here; it's going to become the zero, one vector, so we'll write that right over here. And then the zero, one vector, if you're doing a 90 degree counterclockwise rotation, will become the negative one, zero vector. So in theory these two transformations should undo each other. If I do a transformation that is 90 degrees clockwise, and then I apply a transformation that's 90 degrees counterclockwise, I should get back to where we began. Now let's see what happens when we compose these two transformations, and we know how to do that; we've already talked about it. We essentially multiply these two matrices. Composing two, two by two matrices is equivalent to multiplying them; we've seen that in other videos. So we multiply the clockwise matrix, whose rows are zero, one and negative one, zero, times the counterclockwise matrix, whose rows are zero, negative one and one, zero. What do we get? Well, let's see. For this top left entry we look at this row and this column, and that's going to be zero times zero plus one times one, so that is going to be one. Then we look at this row and this column: zero times negative one plus one times zero is just going to be zero. And then we're going to multiply this row times each of those columns.
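The composition being worked through here can be checked directly. This is a minimal sketch in plain Python (no libraries): each rotation matrix has, as its columns, the images of the two unit vectors described above, and multiplying them gives the identity:

```python
# Compose the two 90-degree rotations by multiplying their 2x2 matrices.

def matmul2(A, B):
    """Multiply two 2x2 matrices given as nested lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Columns are the images of the unit vectors:
clockwise = [[0, 1], [-1, 0]]         # (1,0) -> (0,-1), (0,1) -> (1,0)
counterclockwise = [[0, -1], [1, 0]]  # (1,0) -> (0,1),  (0,1) -> (-1,0)

print(matmul2(clockwise, counterclockwise))  # [[1, 0], [0, 1]] -- the identity
```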
So negative one times zero is zero, plus zero times one is zero; and then negative one times negative one is one, plus zero times zero is zero, giving one. And look what happened: when we took the composition of these two matrices that should undo each other, we see that it does. It turns into the identity transformation, or the identity matrix. We know that this matrix right over here, as a transformation, is just going to map every vector onto itself. Now, this is really interesting, because if we view these two by two transformation matrices as functions, we've just shown that if we call this, say, our first function, then we can call this its inverse. And we actually use that same language when we talk about matrices. If we call this matrix A, we would call this one A inverse. So if I were to take matrix A and multiply it times its inverse, I should get the identity matrix, which is right over here. And here I'm speaking in generalities; I'm not just talking about the two by two case. The same should hold in the three by three case, the four by four case, and so on. And we also know that I could have defined this bottom one as A and the top one as A inverse, and so the other way should be true as well: A inverse times A should also be equivalent to the identity matrix. And so that's completely analogous to what we saw in these function examples between a function and its inverse, because, as we said, an n by n matrix can be viewed as a transformation, can be viewed as a function. And we also see that it has analogs to how we think about multiplication, 'cause here we could do this multiplication as a composition of transformations, but we can also just view it as matrix multiplication.
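The claim that A times A inverse and A inverse times A both give the identity can be checked for any invertible matrix, not just the rotations. The matrix A below is an illustrative example (not from the video), paired with its inverse:

```python
# Sketch: A times A-inverse and A-inverse times A both equal the identity.

def matmul2(A, B):
    """Multiply two 2x2 matrices given as nested lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[2, 1], [1, 1]]       # an illustrative invertible matrix
A_inv = [[1, -1], [-1, 2]]  # its inverse

print(matmul2(A, A_inv))  # [[1, 0], [0, 1]]
print(matmul2(A_inv, A))  # [[1, 0], [0, 1]]
```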
And so if we take a matrix and we multiply it by its inverse, that's analogous to taking a number and multiplying it by its reciprocal, and we get the equivalent of what in the number world would just be one, but in the matrix world is the identity matrix. 'Cause the identity matrix has this nice property that if I were to take the identity matrix and multiply it times any matrix, you're going to get the original matrix again, which matches the analog we saw in the regular number world.
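That last property, that the identity matrix leaves any matrix unchanged, just as multiplying by one leaves a number unchanged, can be sketched the same way. The matrix M below is an arbitrary example:

```python
# Sketch: the identity matrix plays the role that 1 plays for numbers.

def matmul2(A, B):
    """Multiply two 2x2 matrices given as nested lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

I = [[1, 0], [0, 1]]  # the 2x2 identity matrix
M = [[3, 5], [7, 2]]  # an arbitrary example matrix

print(matmul2(I, M) == M)  # True
print(matmul2(M, I) == M)  # True
```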