# Gram-Schmidt process example

## Video transcript

We came up with a process for generating an orthonormal basis in the last video, and it wasn't a new discovery. It's called the Gram-Schmidt process. But let's apply that now to some real examples, and hopefully, we'll see that it's a lot more concrete than it might have looked in the last video. Let's say I have the plane x1 plus x2 plus x3 is equal to 0. This is a plane in R3. So let's just say that the subspace V is equal to the plane defined by this guy right here: x1 plus x2 plus x3 is equal to 0. For all of the vectors in the subspace, if you take their entries and you add them up, you're going to get 0. So first we need just any basis for V, so let's see if we can come up with that. If we subtract x2 and x3 from both sides of this equation, we know that x1 is going to be equal to minus x2 minus x3. Let's say that x2 is equal to c1, and x3 is equal to c2. Then this equation would be x1 is equal to minus c1 minus c2. So if we write it that way, then the subspace V is the set of all of the vectors in R3 of the form c1 times some vector plus c2 times some other vector, where c1 and c2 are any real numbers, so c1 and c2 are members of the reals. And so what is x1? x1 is equal to minus c1 minus c2. So x1 is equal to minus 1 times c1, minus 1 times c2. x2 is just equal to c1, so x2 is equal to 1 times c1 plus 0 times c2. And then x3 is equal to c2, or 0 times c1 plus 1 times c2. So V is essentially the span of these two vectors, all of the linear combinations of these two vectors. That would represent that plane. So let me write it like this: V is equal to the span of the vectors minus 1, 1, 0, and the vector minus 1, 0, 1. And you can see that these are linearly independent right here.
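As a quick sanity check, we can verify numerically that these two basis vectors actually lie in the plane and are linearly independent. This is a small NumPy sketch, not part of the video:

```python
import numpy as np

# Basis vectors for the plane x1 + x2 + x3 = 0, as found above
v1 = np.array([-1.0, 1.0, 0.0])
v2 = np.array([-1.0, 0.0, 1.0])

# Each vector's entries sum to 0, so both satisfy the plane equation
print(v1.sum(), v2.sum())  # 0.0 0.0

# The 3x2 matrix [v1 v2] has rank 2, so the vectors are linearly independent
rank = np.linalg.matrix_rank(np.column_stack([v1, v2]))
print(rank)  # 2
```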
Obviously, there's no linear combination of this guy that can give you a 1 over here, and there's no linear combination of this guy that'll give you a 1 right there. So this is what V is. But what we want, the whole reason why I'm making this video, is to find an orthonormal basis for V. This is just a basis. These guys right here are just a basis for V. Let's find an orthonormal basis. Let's call this vector up here v1, and let's call this vector right here v2. So if we wanted to find an orthonormal basis for the span of v1-- let me write this down. Let me define some subspace V1 that is equal to the span of just my vector v1. Well, we saw in the last video that if we just divide v1 by its length, we get a unit vector, and the span of that unit vector is going to be the same thing as the subspace V1. It's this line in R3. So let's do that. What is the length of v1? The length of v1 is equal to the square root of minus 1 squared, which is 1, plus 1 squared, which is 1, plus 0 squared, which is 0, so it's equal to the square root of 2. So let's define some vector u1 equal to 1 divided by the length of v1, so 1 over the square root of 2, times v1, times minus 1, 1, 0. Then the span of v1 is just the same thing as the span of u1. And so this vector right here would be an orthonormal basis for just the span of v1. But we don't want just the span of v1, we want the span of v1 and v2. Let me just draw it. So right now, we have a basis, if I just take u1. I'm not going to actually draw what this looks like. Maybe it looks something like this, and its span is this entire line in R3. The span of just one vector in Rn is just going to be all the scalar multiples of it, or a line in Rn. So this right here is the subspace V1. Now, we have a v2 here, which is linearly independent from v1, which means it's also linearly independent from u1, because u1 is just a scaled version of v1.
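The normalization step just described can be carried out numerically. A short NumPy sketch, using the v1 from above:

```python
import numpy as np

v1 = np.array([-1.0, 1.0, 0.0])

# u1 = v1 / ||v1||, where ||v1|| = sqrt(2)
u1 = v1 / np.linalg.norm(v1)
print(u1)                  # approximately [-0.7071, 0.7071, 0]

# u1 is a unit vector, so span(u1) = span(v1) = V1
print(np.linalg.norm(u1))  # 1.0
```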
So v2 is going to look like that. That is v2 right there. This, of course, was our u1. And what we want to do is we want to find a subspace-- I'll call it V2 for now. V2 is equal to the span of v1 and v2, which is the same thing as the span of u1 and v2, since anything that's spanned by v1 is also spanned by u1. So we want to see everything that could be generated by linear combinations of u1 and v2. And obviously, this thing right here is our plane that we're talking about. The span of these two guys, that is the whole subspace that we're talking about in this problem. So that is equal to V. So once we find an orthonormal basis for this span, we're done. So how can we do that? Well, if I can find a vector that's orthogonal to all of the linear combinations of u1, such that if I add some vector in the span of u1 to it I can get v2, then I can replace v2 with that vector. So we can call that vector right there y2. If I can determine a y2, this y2 is clearly orthogonal to everything over here, and I can take some vector in V1, in this line, and add it to y2, and I can get to v2. So combinations of these guys are just as good as v2. So this is going to be equal to the span of u1 and y2. Now what is y2 equal to? Well, we saw in the last video that this vector right here is the projection of v2 onto the subspace V1. And then what would y2 be? y2 would be v2 minus that. So y2 is equal to v2 minus the projection of v2 onto V1. Or, if we were to actually write it out, what is that going to be equal to? So it's going to be equal to-- v2 is this vector right here, so it's minus 1, 0, 1. That is v2. v2 minus the projection of v2 onto V1. Well, the projection of the vector v2 onto the subspace V1 is just v2, minus 1, 0, 1, dotted with the orthonormal basis for V1. The orthonormal basis for V1 is just u1.
And we solved for u1 up here, so that's going to be that dotted with 1 over the square root of 2 times-- let me do that in yellow, actually, just so you can see that this is u1. So dotting it with u1, so dotting it with 1 over the square root of 2 times minus 1, 1, 0-- I like leaving the 1 over the square root of 2 out of there, just to keep things simple-- all of that divided by-- actually, not divided by anything. Because if we were projecting onto a line spanned by a vector that isn't normalized, we would divide by the dot product of that vector with itself, but u1's length is 1, so we don't have that term, and we saw that before. Let me make the numbers clear. This is v2 minus the projection of v2 onto the subspace V1. So that's just v2 dotted with my first vector in my orthonormal basis for V1-- there's only one, so I'm only going to have one term here-- and then all of that times my orthonormal basis vector for V1. So 1 over the square root of 2 times the vector minus 1, 1, 0. Now this looks really fancy, but what does it simplify to? Remember, this right here, this piece right there, that's the projection onto V1 of v2. So this is going to be equal to the vector minus 1, 0, 1, minus-- now, I can take the 1 over the square root of 2 on the outside. Actually, I can take both of these onto the outside. So 1 over the square root of 2 times 1 over the square root of 2 is just going to be 1 over 2, right? So this is going to be minus 1/2 times these guys dotted with each other. So what is these guys dotted with each other? It's just going to be a number. Minus 1 times minus 1 is 1, plus 0 times 1, so plus 0, plus 1 times 0, so plus 0.
All of that times-- well, we already used this part of it, so we just have this part left over-- times minus 1, 1, and 0. This was the dot product, and we took the two scaling factors out. When you multiply them you get 1/2, and the dot product itself is just 1, which simplifies things. So this is going to be equal to the vector minus 1, 0, 1, minus 1/2 times minus 1, 1, 0. And 1/2 times minus 1 is minus 1/2, then we have 1/2, and then we have 0. And so this is going to be equal to minus 1 minus minus 1/2-- that's minus 1 plus 1/2, so that's just minus 1/2. 0 minus 1/2 is minus 1/2. And then 1 minus 0 is just 1. So this right here is our vector y2. And if you combine u1 right here and y2, we span our subspace V. But we don't have an orthonormal basis yet. These guys are orthogonal with respect to each other, but this guy does not have length 1 yet. So to make it length 1, let's replace him. Let's define another vector u2 that's equal to 1 over the length of y2, times y2. So what is the length of y2? The length of y2 is equal to the square root of minus 1/2 squared, which is 1/4, plus minus 1/2 squared, which is another 1/4, plus 1 squared. So that's 1/4 plus 1/4 plus 1, which is 1 and 1/2, or 3/2. So the length of y2 is equal to the square root of 3/2. So if I define u2, u2 is equal to 1 over the square root of 3/2-- or, that's the same thing as the square root of 2/3-- times y2, which is this guy right here: minus 1/2, minus 1/2, and 1. And I already defined u1 up here. So I found u1 right here. We now have two vectors that are orthogonal with respect to each other. So if I have the set of u1 and u2, these guys both have length 1, they are orthogonal with respect to each other, and they span V. So this is an orthonormal basis for the plane that we started this video out with: for V. And we're done.
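The whole y2 and u2 computation can be checked numerically. A NumPy sketch of the steps above, which should give y2 = (-1/2, -1/2, 1):

```python
import numpy as np

v1 = np.array([-1.0, 1.0, 0.0])
v2 = np.array([-1.0, 0.0, 1.0])

u1 = v1 / np.linalg.norm(v1)     # u1 = (1/sqrt(2)) * (-1, 1, 0)

# y2 = v2 - (projection of v2 onto V1) = v2 - (v2 . u1) u1
# No denominator is needed because ||u1|| = 1
y2 = v2 - np.dot(v2, u1) * u1
print(y2)                        # [-0.5 -0.5  1. ]

# Normalize: u2 = y2 / ||y2||, where ||y2|| = sqrt(3/2)
u2 = y2 / np.linalg.norm(y2)

# u1 and u2 are orthonormal: unit length and mutually orthogonal
print(np.linalg.norm(u2))        # 1.0
print(np.dot(u1, u2))            # 0.0 (up to floating-point error)

# Both still lie in the plane x1 + x2 + x3 = 0
print(u1.sum(), u2.sum())
```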
We have done the Gram-Schmidt process. These are our new orthonormal basis vectors.
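More generally, the procedure we just walked through-- for each new vector, subtract its projection onto the span of the vectors already processed, then normalize-- can be sketched as a short function. The function name and tolerance are my own choices, not from the video:

```python
import numpy as np

def gram_schmidt(vectors):
    """Return an orthonormal basis for the span of the given vectors."""
    basis = []
    for v in vectors:
        y = np.array(v, dtype=float)
        # Subtract the projection onto each previously found unit vector
        for u in basis:
            y = y - np.dot(y, u) * u
        # Skip vectors that are linearly dependent on the ones before
        if np.linalg.norm(y) > 1e-10:
            basis.append(y / np.linalg.norm(y))
    return basis

# The example from the video: a basis for the plane x1 + x2 + x3 = 0
u1, u2 = gram_schmidt([[-1, 1, 0], [-1, 0, 1]])
print(u1)  # (1/sqrt(2)) * (-1, 1, 0)
print(u2)  # sqrt(2/3) * (-1/2, -1/2, 1)
```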