
Say I've got some matrix A, an n-by-k matrix, and not just any n-by-k matrix: its columns a1, a2, all the way through ak are linearly independent. What does that mean? It means that the only solution to x1 a1 + x2 a2 + ... + xk ak = 0 is for all of the xi's to equal 0. That's what linear independence implies. Another way to write it: the only solutions to A times the column vector (x1, x2, ..., xk) equaling the zero vector are the ones where every entry is 0. This is just another way of writing the same equation; we've seen it multiple times. So the only solution to Ax = 0 is x equal to the zero vector, and that is all coming out of the fact that A's columns are linearly independent. And since the only solution to Ax = 0 is x = 0, the null space of A must be the set containing just the zero vector. That's all a bit of review.

Now, A is n by k; we don't know its dimensions, so it may or may not be a square matrix, and we don't necessarily know whether it's invertible. But maybe we can construct an invertible matrix with it. Let's study A^T times A. A is an n-by-k matrix and A^T is a k-by-n matrix, so A^T A is going to be a k-by-k matrix. It's a square matrix, and that's a nice place to start for an invertible matrix. So let's see if it's actually invertible.
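A quick numerical sketch of this setup (the 4-by-2 matrix here is just an illustrative choice of an A with linearly independent columns, not anything from the video):

```python
import numpy as np

# Hypothetical n-by-k matrix (n = 4, k = 2) with linearly independent columns.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0],
              [2.0, 3.0]])
n, k = A.shape

# Linear independence of the columns means rank(A) == k,
# i.e. the only solution of A x = 0 is x = 0.
print(np.linalg.matrix_rank(A) == k)   # True

# A is n-by-k and A^T is k-by-n, so A^T A is a square k-by-k matrix.
AtA = A.T @ A
print(AtA.shape)                       # (2, 2)
```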
We don't know anything about A other than that its columns are linearly independent, so let's see whether A^T A is invertible. Essentially, if we can show that all of its columns are linearly independent, then we'll know it's invertible. Here's why (I'll get back to this at the end of the video): if you have a square matrix with linearly independent columns, remember that the linearly independent columns are all associated with pivot columns when you put the matrix in reduced row echelon form. So if it's a k-by-k matrix, its reduced row echelon form will have k pivot columns and will itself be a square k-by-k matrix, and there's only one k-by-k matrix with k pivot columns: the k-by-k identity matrix. And when reducing a matrix to reduced row echelon form gives you the identity matrix, that means your matrix is invertible. I could have left that to the end of the video, but I wanted to show where we're headed. We already know that A^T A is a square matrix. So if we can show that, given that A has linearly independent columns, A^T A also has linearly independent columns, then, since the columns are linearly independent and it's a square matrix, its reduced row echelon form is the identity matrix, and that tells us it's invertible.

So let's see if we can prove that the columns of A^T A are linearly independent.
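That square-matrix fact can be checked directly with sympy's rref. The 3-by-3 matrix below is a made-up example with linearly independent columns (its determinant is 7):

```python
from sympy import Matrix, eye

# A hypothetical 3x3 matrix with linearly independent columns (det = 7).
M = Matrix([[2, 1, 0],
            [0, 1, 1],
            [1, 0, 3]])

# rref() returns the reduced row echelon form and the tuple of pivot columns.
R, pivots = M.rref()

print(pivots)       # (0, 1, 2): every column is a pivot column
print(R == eye(3))  # True: the only 3x3 matrix with 3 pivot columns
```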
Let's say I have some vector v that is a member of the null space of A^T A. That means that if I take A^T A times my vector v, I get the zero vector. Fair enough. Now what happens if I multiply both sides of the equation by v^T? I get v^T A^T A v on the left and v^T times the zero vector on the right. You can view each side as a matrix-vector product, or, since a row vector times a column vector is essentially a dot product, the right-hand side is the dot product of v with the zero vector, which is the scalar 0. Now what is the left-hand side going to be? We've seen this before. Even though v^T is the transpose of a vector, you can view it as a matrix: v is a k-by-1 matrix, so v^T is a 1-by-k matrix. And we've seen that the transpose of a product is the reverse product of the transposes, so v^T A^T is the same thing as (Av)^T. That lets me replace the left side with (Av)^T times Av, and that must equal the right side. Now what is this? When I multiply a matrix times a vector, the result is another vector, so Av is a vector, and if I take some vector's transpose and multiply it by that same vector, we've seen before that that is the same thing as the vector dotted with itself.
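The rewriting step above, v^T (A^T A) v = (Av) . (Av), can be spot-checked numerically; the random A and v here are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))  # arbitrary 5x3 matrix
v = rng.standard_normal(3)       # arbitrary vector in R^3

Av = A @ v
lhs = v @ (A.T @ A @ v)  # v^T (A^T A) v, a scalar
rhs = Av @ Av            # (Av) . (Av) = ||Av||^2

print(np.isclose(lhs, rhs))  # True
```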
So this thing right here is the same thing as Av dot Av. And what does the right-hand side equal? Remember, v^T and the zero vector each have k elements, so v^T times the zero vector is the dot product of v with the zero vector, which is the scalar 0, not a vector; I want to make sure I clarify that, because it wouldn't make sense otherwise. So Av dot Av is equal to 0, which is to say the magnitude, or length, of Av squared is equal to 0: ||Av||^2 = 0. The only vector whose length is 0 is the zero vector, so Av must be equal to the zero vector. Now, we started off by saying that v is any member of the null space of A^T A, but from that assumption alone it turns out that Av = 0, so v also has to be a member of the null space of A. Let's write that down: if v is a member of the null space of A^T A, then v is a member of the null space of A. But our null space of A, because A's columns are linearly independent, contains only the zero vector. So if this vector is a member of the null space of A^T A, and therefore a member of the null space of A, there's only one thing it can be.
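Note that the containment just shown, null(A^T A) inside null(A), holds for any real matrix A, not only one with independent columns. A small check with a deliberately rank-deficient matrix (third column = first + second, an illustrative choice):

```python
import numpy as np

# 4x3 matrix whose third column is the sum of the first two,
# so x = (1, 1, -1) is a nonzero null space vector.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 2.0],
              [2.0, 0.0, 2.0]])
x = np.array([1.0, 1.0, -1.0])

print(np.allclose(A @ x, 0))        # True: x is in null(A)
print(np.allclose(A.T @ A @ x, 0))  # True: the same x is in null(A^T A)

# The two null spaces coincide, so the ranks agree as well.
print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(A.T @ A))  # 2 2
```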
There's only one entry in that set, so v has to be equal to the zero vector. Another way to say that: any v in the null space of A^T A has to be the zero vector, so the null space of A^T A is equal to the null space of A, which is just the zero vector sitting there. Now what does that do for us? It tells us that the only solution to A^T A x = 0 is x equal to the zero vector, because the null space of A^T A, which is exactly the set of solutions to that equation, contains only the zero vector. And that means the columns of A^T A are linearly independent: you can write the linear combination of the columns weighted by the entries of x, which is the same argument we used at the beginning of the video. So A^T A has linearly independent columns, and it's a square matrix; that came right out of its definition. That tells me that the reduced row echelon form of A^T A is the k-by-k identity matrix, which tells me that A^T A is invertible. And that's a pretty neat result. I started with a matrix that has linearly independent columns, so it wasn't just any run-of-the-mill matrix, but it might have weird dimensions; it's not necessarily a square matrix. Yet I can construct the square matrix A^T A with it, and we now know that it also has linearly independent columns.
It's a square matrix, and therefore it is invertible.
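Putting the whole argument together numerically: for a tall matrix A with linearly independent columns, A^T A really is invertible, which is exactly what makes the least-squares normal equations x = (A^T A)^{-1} A^T b solvable. (The matrix and right-hand side below are made-up illustrative values.)

```python
import numpy as np

# Tall 4x2 matrix with linearly independent columns.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
AtA = A.T @ A

# A^T A is invertible: inv() succeeds and inv(AtA) @ AtA is the identity.
inv = np.linalg.inv(AtA)
print(np.allclose(inv @ AtA, np.eye(2)))  # True

# The normal-equations solution matches numpy's least-squares solver.
b = np.array([1.0, 2.0, 2.0, 4.0])
x_hat = inv @ A.T @ b
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x_hat, x_ls))  # True
```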