
Another example of a projection matrix

Figuring out the transformation matrix for a projection onto a subspace by figuring out the matrix for the projection onto the subspace's orthogonal complement first. Created by Sal Khan.

Want to join the conversation?

  • Farrukh Saeed
    Just a bit curious if this is a special case. At , we have seen that V is the null space of [1 1 1]. What would the situation be if V is not a null space, or if the equation were something like x1 + x2 + x3 = 5?

    Does that mean we cannot use Proj_v⊥(x) for simplicity? Would we need to stick with the method described in the previous video?
    (5 votes)
  • Xuxa Kasandimedjo
    What is an orthoplement?? I got this on a test!
    (5 votes)
  • Tyler Tian
    At , Sal said, "And we could actually just take that out. It's a 1 by 1 matrix, which is essentially equivalent to a scalar". But isn't this sloppy notation, as 1x1 matrices aren't the same as scalars? Is this considered bad practice? And if I were to directly "take [scalars] out" of 1x1 matrices, would it always work or are there situations where I need to watch out?
    (2 votes)
    • Alex
      A 1 * 1 matrix serves the same purpose as a scalar, i.e. representing a single number. In reality, the operations on a 1 * 1 matrix are so limited that I've never actually seen a "1 * 1 matrix" in my textbook. It's this scalar-but-not-a-scalar, matrix-but-not-a-matrix sort of thing that can only be added to other 1 * 1's, and can only be multiplied by certain matrices (which kind of defeats the importance of scalar multiplication in linear algebra). A numeric check of why "taking the scalar out" still works here is sketched below.
      (3 votes)
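A minimal numpy sketch (mine, not from the video or this thread) of why "taking the scalar out" is safe in this particular formula: a 1 by 1 matrix sandwiched between compatible matrices acts exactly like multiplication by the scalar it contains.

```python
# Compare inverting the 1x1 matrix D^T D with just dividing by its entry.
import numpy as np

D = np.array([[1.0], [1.0], [1.0]])      # 3x1: basis of v's orthogonal complement
DtD = D.T @ D                            # 1x1 matrix [[3.]]

C_matrix = D @ np.linalg.inv(DtD) @ D.T  # keep it as a 1x1 matrix and invert
C_scalar = (1.0 / DtD[0, 0]) * (D @ D.T) # "take the scalar out" instead
print(np.allclose(C_matrix, C_scalar))   # True: the two agree
```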
  • Nguyễn Mai Chi
    At , Khan said the orthogonal complement to V is a line. I don't quite understand, since I think other lines parallel to span([1,1,1]) could be the orthogonal complement to V, too. Can someone please explain?
    (2 votes)
  • Linus Streetfarmers Blomberg
    Is orthonormal the same as orthogonal complement?
    (1 vote)
    • Bob Fred
      An orthonormal set is a set of (linearly independent) vectors that are orthogonal to every other vector in the set and all have length 1, as defined by the inner product. An orthogonal complement is taken of a set in an inner product space: it is the set of all vectors in the inner product space that are orthogonal to the original set. Notice that a plain vector space has no definition of orthogonality.
      (2 votes)
  • Johan
    At , Sal says that the vector v is by definition the projection of x onto v. How is that so? What definition is he referring to?
    (1 vote)
  • hugh macdonald
    At , shouldn't the left side be [D transpose] x [D]?
    (1 vote)
  • Iron Programming
    Hi,

    I am making a 3D program, and here is what I am trying to do:
    1. Since we are looking at the 3D program through a 2D screen, I want to "project" the coordinates of the matrices onto the 2D plane (the screen). I believe this will give my program perspective.
    2. To do that I need to find the subspace that is the plane z = 0 (where x & y are free variables), and then find its basis so I can plug it into the equation to find the projection.
    3. But I'm stumped for some reason. I can't seem to do this. Any help?

    Summary: I need to find the basis for the plane z = 0.

    Here is what I don't understand: should I have the basis be in R2 or R3?

    A = [1, 0]
        [0, 1]

    or

    A = [1, 0, 0]
        [0, 1, 0]
        [0, 0, 0]


    Thanks for your time. :)
    (1 vote)
  • zlu1993
    So B = I3 - C; is this a pattern for finding projection matrices in higher dimensions? In other words, does it always work in higher dimensions, besides the general projection matrix formula A(A^T A)^-1 A^T?
    (1 vote)
    • Bernard Field
      It will always work, provided you use the appropriately sized Identity matrix (In for Rn).
      This is because, in general, an arbitrary vector can be expressed as a unique sum of a vector in a subspace and a vector in its orthogonal complement. This pattern is derived from that property, and that property is valid for Rn. A numeric check in R4 is sketched below.
      (1 vote)
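Here is a hedged numpy sketch (mine, not from the thread) of the check Bernard describes, done in R4 for the hyperplane x1 + x2 + x3 + x4 = 0:

```python
# B = I4 - C should match the general formula A(A^T A)^-1 A^T in R4.
import numpy as np

n = 4
D = np.ones((n, 1))                       # basis of the orthogonal complement
C = D @ np.linalg.inv(D.T @ D) @ D.T      # projection onto the complement
B_pattern = np.eye(n) - C                 # the "I - C" pattern

A = np.array([[-1, -1, -1],               # columns: a basis for the hyperplane
              [ 1,  0,  0],
              [ 0,  1,  0],
              [ 0,  0,  1]], dtype=float)
B_direct = A @ np.linalg.inv(A.T @ A) @ A.T
print(np.allclose(B_pattern, B_direct))   # True
```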
  • Esther Turcotte
    I don't see how this example is helpful. He just set x2 and x3 as free variables, seemingly at random-- why? And what if the set spanning our subspace is linearly independent? And what if the condition defining our subspace equals some constant rather than 0? Without those answers, I can't apply what I learned from this video to real problems or exams.
    (1 vote)
    • Ma Fai
      You can check the Null Space video; it shows how to use free variables to represent the solution set as the subspace, and the spanning set of N(A) found that way is linearly independent. If the condition equals 5, then V is not a null space, and, as the comments above say, you can't use this method to find the projection matrix.
      (1 vote)

Video transcript

Let's say I have a subspace v that is equal to all of the vectors-- let me write it this way-- all of the x1, x2, x3's, so all the vectors that satisfy x1 plus x2 plus x3 is equal to 0. If you think about it, this is just a plane in R3, so this subspace is a plane in R3. And I'm interested in finding the transformation matrix for the projection of any vector x in R3 onto v. So how could we do that? We could do it like we did in the last video: find a basis for this subspace right there. And that's not too hard to do. If we assume that x2 and x3 are free variables, then x1 is equal to minus x2, minus x3. And then, just so we can write our solution set as a combination of basis vectors, we can say x2 is equal to some arbitrary constant, C2, and x3 is equal to some arbitrary constant, C3. Then we can rewrite v-- I'll do it here-- as the set of all x1's, x2's, and x3's where x1 is equal to minus C2, minus C3. So x1 is equal to minus 1 times C2, plus C3 times what? Plus C3 times minus 1. And then what is x2 equal to? x2 is just equal to C2, so it's 1 times C2, plus 0 times C3. x3 is just equal to C3, so it's 0 times C2, plus 1 times C3. And so this is another way of defining our subspace. All of the vectors that satisfy this are exactly the vectors in the definition up here-- the vectors whose entries lie in that plane-- and that's for any real numbers C2 and C3. Or another way of writing this: v is equal to the span of the vectors minus 1, 1, 0 and minus 1, 0, 1. Just like that. And we know that these are actually a basis for v, because they're linearly independent. There's no way I can take linear combinations of this guy and make the second entry equal a 1 here, and likewise there's no way I can take linear combinations of this guy and make the third entry equal a 1 here. So these are a basis for v. So, just using the technique we did before, we could set some matrix A equal to minus 1, 1, 0, and then minus 1, 0, 1. And then we can figure out that the projection of any vector x in R3 onto v is going to be equal to-- and we saw this-- A, times the inverse of A transpose A, all of that times A transpose, and all of that times x. And you can do it. You have A here. You can figure out what the transpose of A is very easily. You can take A transpose A, then you can invert it. It'll be very similar to what we did in the last video-- a little less work, because this is a 3 by 2 matrix instead of a 4 by 2 matrix-- but you saw it's actually a lot of work. It's very hairy, and you might make some careless mistakes. So let's figure out if there's another way that we can come up with this matrix right here. Now, we know that if x is a member of R3, then x can be represented as some vector v that is in our subspace, plus some vector w that is in the orthogonal complement of our subspace.
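As an aside for readers following along in code, here is a minimal numpy sketch (mine, not from the video) of that "hairy" direct route, using the basis found above:

```python
# Direct computation of the projection matrix onto v via A(A^T A)^-1 A^T.
# A's columns are the basis vectors (-1, 1, 0) and (-1, 0, 1) found above.
import numpy as np

A = np.array([[-1, -1],
              [ 1,  0],
              [ 0,  1]], dtype=float)

B = A @ np.linalg.inv(A.T @ A) @ A.T   # projection matrix onto v
print(np.round(B, 4))
# [[ 0.6667 -0.3333 -0.3333]
#  [-0.3333  0.6667 -0.3333]
#  [-0.3333 -0.3333  0.6667]]
```

This should agree with the much simpler derivation that follows.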
Now by definition, that right there is the projection of x onto v, and this is the projection of x onto the orthogonal complement of v. So we can write that x is equal to the projection onto v of x, plus the projection onto v's orthogonal complement-- the orthogonal complement of v-- of x. So, by definition, any member of R3 can be represented this way. Now we want to write this as matrix vector products. Two videos ago I showed you that these are linear transformations, so they can be written as matrix vector products. You see that right there. Let me define this matrix-- I don't know, let me call this matrix T-- actually, let me do another letter, let me do B. And let's say that the projection onto the orthogonal complement of v of x is equal to some other matrix C times x. We know this is a linear transformation, so it can be represented as some matrix C times x. So what are these going to be equal to? Well, if I want to write x as a linear transformation of x, I could just write it as the 3 by 3 identity matrix times x, right? That's the same thing as x. That's going to be equal to the projection of x onto v-- that's just the same thing as B times x-- plus the projection of x onto v's orthogonal complement-- that's just C times x. And if you want to factor out the x on this side, we know that matrix vector products exhibit the distributive property, so we can write that the identity matrix times x is equal to B plus C, times x. Or another way to view this equation: this matrix must be equal to the sum of those two matrices. So we get that the identity matrix in R3 is equal to the projection matrix onto v, plus the projection matrix onto v's orthogonal complement. Remember, the whole point of this problem is to figure out this thing right here-- to solve for B. And we know a technique for doing it: you take A transpose and do this whole thing, but that might be pretty hairy. But maybe it's easy to find this other guy. Maybe, I don't know. It actually turns out that, in this video, it will be easy. So if it's easy to find this guy, we can just solve for B: if we subtract C from both sides, we get that B is equal to I, the identity matrix, minus the transformation matrix for the projection onto v's orthogonal complement. So let's see if we can figure out what C is. Let's go back to our original problem. Remember that v was equal to, essentially, all of the x1's, x2's, x3's that satisfy x1 plus x2, plus x3 is equal to 0. Or another way to say it: all the x1's, x2's, and x3's that satisfy the equation 1, 1, 1, times x1, x2, x3 is equal to the 0 vector-- or, in this case, just the scalar 0. So 1 times x1, plus 1 times x2, plus 1 times x3 is going to equal 0. This is another way to write v. Now, all of the x's that satisfy this right here-- what is that? This is saying that v is equal to the null space of this matrix right there. The null space of this matrix is all of the vectors that satisfy this equation. So v is equal to the null space-- let me write it this way-- the null space of 1, 1, 1, just like that. Up here we figured out v in, kind of, the traditional way.
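To restate that chain of reasoning compactly (the same content as above, just in symbols, with B and C as defined in the video):

```latex
x = \operatorname{proj}_V x + \operatorname{proj}_{V^\perp} x = Bx + Cx = (B + C)\,x
\quad \text{for every } x \in \mathbb{R}^3,
\qquad \text{so} \qquad I_3 = B + C
\quad \Longrightarrow \quad B = I_3 - C.
```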
We figured out that v is the span of these things, but now we know that's the same thing as the null space of 1, 1, 1. These two statements are equivalent. Now, we at least had a hunch that maybe, instead of figuring out this B straight up by doing all of that A transpose silliness, we could figure out the transformation matrix for the projection onto v's orthogonal complement, and then just solve for B, given that the identity matrix minus that guy is going to be equal to B. So let's see if we can figure out the transformation matrix for the projection of x onto the orthogonal complement of v. So this is v. What is v's orthogonal complement? v perp is going to be equal to the orthogonal complement of the null space of this matrix right here. Which is equal to what? Remember, a null space's orthogonal complement is equivalent to the row space-- the column space of A transpose. We saw that multiple times. Or you could say the orthogonal complement of the row space is the null space. We've seen this many, many times before. So the orthogonal complement of this guy is going to be the column space of his transpose-- the column space of 1, 1, 1, just like that. Or we can write that v's orthogonal complement is equal to the span of 1, 1, 1. This matrix has only one column, so its column space is going to be the span of that one column. So just to visualize what we're doing here: that original equation for v describes some plane in R3. That is v right there. And now we've just figured out what v's orthogonal complement is. It's going to be a line in R3-- all of the linear combinations of this guy. I haven't drawn it exactly right-- this should be tilted more, and so should this-- but it's going to be some line. So this is the orthogonal complement of v. So remember the projection-- let me do it this way. This is the basis for v's orthogonal complement. So let's construct some matrix-- let me use a letter that I haven't used before-- some matrix D, whose columns are the basis vectors for the orthogonal complement of v. Well, there's only one basis vector, so it's going to be that. And we learned, in the last video and the video before that, that the projection of any vector in R3 onto v's orthogonal complement is going to be equal to D, times the inverse of D transpose D, times D transpose, times x. Or another way to view it: this thing right here is the transformation matrix for this projection. So let's see if this is easier to solve than that business up there, where we had a 3 by 2 matrix. That was the whole motivation for doing this problem: to figure out the projection matrix onto the subspace v directly, we'd have to work with the 3 by 2 matrix, which seems pretty difficult. Instead, let's find the projection matrix for the projection onto v's orthogonal complement, which is this. So what is D transpose? D transpose is just going to be equal to 1, 1, 1. What is D transpose times D? Well, that's D transpose, and this is D, just like that.
So what is this going to be equal to? This is just the dot product of that and that: 1 times 1, plus 1 times 1, plus 1 times 1, which equals 3. So this thing right here is equal to the 1 by 1 matrix, 3. So let's write it down. This is equal to D-- which is this matrix, 1, 1, 1-- times the inverse of D transpose D. D transpose D is just a 1 by 1 matrix, and we're going to have to invert it. Actually, I've never defined the inverse of a 1 by 1 matrix for you, so this is mildly exciting. Times D transpose-- D transpose looks like this, 1, 1, 1. And then all of that times x. And this right here is the transformation matrix. Now, what is the inverse of a 1 by 1 matrix? You just have to remember that A inverse times A is equal to the identity matrix. If we're dealing with a 1 by 1 matrix, then I'm just trying to figure out what matrix times 3 is going to be equal to the 1 by 1 identity matrix. So 3 inverse times 3 has to equal the 1 by 1 identity matrix, which is just the matrix 1. To get this entry, I take this guy's entry times that guy's entry, so the only matrix that makes this work out is the matrix 1/3: 1/3 times 3 is equal to 1. This is almost trivially simple, but that right there is the inverse matrix for the 1 by 1 matrix 3. So this right here is just 1/3. And we can actually just take that out: it's a 1 by 1 matrix, which is essentially equivalent to a scalar. So-- let me just rewrite it-- the projection of any vector in R3 onto the orthogonal complement of v is equal to 1/3, times the vector 1, 1, 1-- or really, the 3 by 1 matrix 1, 1, 1-- times that matrix transposed, 1, 1, 1. And then all of that times x. And you can see, this is a lot simpler than if we had to do all of that business with this matrix. That's a harder matrix to deal with; this 1, 1, 1 matrix is very easy. Now what is this going to be equal to? It's going to be equal to 1/3 times-- we have a 3 by 1 times a 1 by 3 matrix, so it's going to result in a 3 by 3 matrix. And what do we get? The first entry is going to be 1 times 1, which is 1. The second entry is going to be 1 times 1, which is 1. The third entry is going to be 1 times 1, which is 1. I think you see the pattern. The second row, first column: 1 times 1, which is 1. So this is going to be a 3 by 3 matrix of 1's. So just like that-- and that was a pretty straightforward situation-- we were able to get the projection matrix for any vector in R3 onto v's orthogonal complement. Now, we know that this thing right here is our original C. And we said-- let me refer back to what I wrote way up here-- that the identity matrix is equal to the transformation matrix for the projection onto v, plus the transformation matrix for the projection onto v's orthogonal complement. Or: the transformation matrix for the projection onto v is equal to the identity matrix minus the transformation matrix for the projection onto v's orthogonal complement. So if we say that the projection onto v of x is equal to B times x, we know that B is equal to the 3 by 3 identity matrix, minus C, and this is C right there.
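Here is a quick numpy check (mine, not from the video) of that arithmetic: D transpose D is the 1 by 1 matrix [3], its inverse is [1/3], and C comes out as a 3 by 3 matrix of 1/3's.

```python
# C = D (D^T D)^-1 D^T, with D the single basis vector of v's complement.
import numpy as np

D = np.array([[1.0], [1.0], [1.0]])        # 3x1 matrix
C = D @ np.linalg.inv(D.T @ D) @ D.T       # (D^T D) is [[3.]], inverse [[1/3]]
print(np.round(C, 4))
# [[0.3333 0.3333 0.3333]
#  [0.3333 0.3333 0.3333]
#  [0.3333 0.3333 0.3333]]
```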
So B is equal to the identity matrix-- so that's just 1, 0, 0, 0, 1, 0, 0, 0, 1-- minus C, minus 1/3 times 1, 1, 1, 1, 1, 1, 1, 1, 1, just like that. And what is this going to be equal to? Let's multiply this out in our heads. All of the entries of this second matrix are essentially going to be 1/3 once we multiply it through: 1/3, 1/3, 1/3, everything is 1/3. So along the diagonal we have 1 minus 1/3, which is 2/3. All of the 1's minus 1/3 are going to be 2/3, so we just go down the diagonal. And then the 0's minus 1/3 are going to be minus 1/3: minus 1/3, minus 1/3, minus 1/3, and so on for every off-diagonal entry. And just like that, we've been able to figure out our transformation matrix for the projection of any vector x onto v, by first finding the transformation matrix for the projection of any x onto v's orthogonal complement. Anyway, I thought that was pretty neat. And you could rewrite this as B equal to 1/3 times a matrix with 2's along the diagonal and minus 1's everywhere else. Anyway, see you in the next video.
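And to close the loop, a short numpy sketch (again mine, not from the video) confirming that B = I3 - C matches the direct computation from earlier and really does project onto the plane:

```python
# B = I3 - C, where C projects onto v's orthogonal complement.
import numpy as np

C = np.full((3, 3), 1.0 / 3.0)        # the 3x3 matrix of 1/3's from above
B = np.eye(3) - C                     # 2/3 on the diagonal, -1/3 elsewhere
print(np.round(B, 4))

x = np.array([1.0, 2.0, 3.0])         # an arbitrary test vector (my choice)
print((B @ x).sum())                  # ~0: B x satisfies x1 + x2 + x3 = 0
print(np.allclose(B @ B, B))          # True: projecting twice changes nothing
```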