
- [Voiceover] So I have the matrix A over here, and A has m rows and n columns, so we could call this an m by n matrix. What I want to do in this video is relate the linear independence, or linear dependence, of the column vectors of A to the null space of A. So first of all, what am I talking about when I say column vectors? Well, as you can see, there are n columns here, and we can view each of them as an m-dimensional vector. So we can view this first one as v1, the next one as v2, and we'll have n of these because we have n columns, so this last one right over here is vn. And so we could rewrite the matrix A (the m by n matrix A, and I'm bolding it to show that it's a matrix) in terms of its column vectors: v1 for the first column, v2 for the second column, all the way to vn for the nth column. And remember, each of these column vectors is going to have m components; these are m-dimensional column vectors. Now, what I want to do is relate the linear independence of these vectors to the null space of A.
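To make the columns-as-vectors idea concrete, here is a minimal sketch in Python with NumPy; the 3 by 2 matrix is my own illustrative example, since the video works with a generic m by n matrix A:

```python
import numpy as np

# An illustrative m-by-n matrix with m = 3 rows and n = 2 columns.
A = np.array([[1, 2],
              [3, 4],
              [5, 6]])

v1 = A[:, 0]  # first column vector, an m-dimensional (here 3-dimensional) vector
v2 = A[:, 1]  # second column vector

# Placing the column vectors side by side reconstructs A exactly.
assert np.array_equal(np.column_stack([v1, v2]), A)
print(v1.tolist(), v2.tolist())  # [1, 3, 5] [2, 4, 6]
```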
So let's remind ourselves what the null space of A even is. The null space of A is the set of all vectors x that are members of R^n (I'm going to double down on why I'm saying R^n in a second) such that, if I take my matrix A and multiply it by one of those x's, I'm going to get the zero vector. So why does x have to be a member of R^n? Well, just for the matrix multiplication to work. If A is m by n, then in order for the matrix-vector multiplication to make sense, x has to be an n by 1 vector; it's going to have n components, so it's going to be a member of R^n. If A were m by, I don't know, seven, then it would be R^7 that we would be dealing with. So that is the null space. Another way of thinking about it: if I take my matrix A and multiply it by some vector x that's a member of this null space, I'm going to get the zero vector. So if I take my matrix A, which I've expressed here in
terms of its column vectors, and multiply it by some vector x, where x is a member of R^n, then x is going to have n components: x1 as the first component, then x2, all the way down to xn. If we say that this x is a member of the null space of A, then this whole product is going to be equal to the zero vector. And once again, that zero vector is an m by 1 vector (it has the same number of rows as A), so it's m zeros: the first zero, the second zero, all the way down to the mth zero. So let's actually multiply this out, using what we know of matrix multiplication. By the definition of matrix multiplication, if you multiply our matrix A times our vector x, you get the first column vector, v1, times the first component, x1, plus the second column vector, v2, times the second component, x2, and you do that n times: x1 v1 plus x2 v2 plus, dot dot dot, plus xn vn. And all of these, when you add them together, are going to be equal to the zero vector.
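This column-by-column expansion of the matrix-vector product can be checked numerically. A small sketch, again with NumPy; the matrix and vector below are my own examples, not from the video:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4],
              [5, 6]])
x = np.array([10, -1])

# A @ x equals x1*v1 + x2*v2 + ... + xn*vn: a linear combination
# of the column vectors of A, weighted by the components of x.
v1, v2 = A[:, 0], A[:, 1]
combo = x[0] * v1 + x[1] * v2

assert np.array_equal(A @ x, combo)
print((A @ x).tolist())  # [8, 26, 44]
```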
Now this should start ringing a bell. When we looked at linear independence, we saw something like this. In fact, we saw that these n vectors, v1, v2, all the way to vn, are linearly independent if and only if the only solution to this equation (or, you could say, the only choice of weights on these vectors) is x1, x2, all the way to xn all equal to zero. So let me write that down: v1, v2, all the way to vn are linearly independent if and only if the only solution to this equation is x1, x2, all the way to xn equal to zero. So if the only way to get this sum to be equal to the zero vector is for x1, x2, all the way through xn to be zero, that means that our vectors v1, v2, all the way to vn are linearly independent. Or vice versa: if they're linearly independent, then the only solution to this equation, if we're solving for the weights on those vectors, is for x1, x2, all the way through xn to be equal to zero. Remember, linear independence,
if you want to say it in a way that's still mathematical but in a little more common language, means that none of these vectors can be constructed as a linear combination of the other vectors. Or, looking at it this way: this expression right over here is a linear combination of all of the column vectors, and the only way to get that linear combination to equal zero is if x1, x2, all the way through xn are equal to zero. And we proved that in other videos on linear independence.
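One way to test this criterion in practice is a sketch, assuming NumPy; the two matrices are my own examples, not from the video. The only solution to x1 v1 + ... + xn vn = 0 is the trivial one exactly when the rank of A equals n, the number of columns:

```python
import numpy as np

def columns_independent(A):
    # The only solution to A @ x = 0 is x = 0 exactly when
    # rank(A) equals the number of columns n.
    return np.linalg.matrix_rank(A) == A.shape[1]

A_indep = np.array([[1, 0],
                    [0, 1],
                    [1, 1]])   # neither column is a multiple of the other
A_dep = np.array([[1, 2],
                  [2, 4],
                  [3, 6]])     # second column = 2 * first column

print(columns_independent(A_indep))  # True
print(columns_independent(A_dep))    # False
```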
Well, if the only solution to this equation is x1 through xn all equal to zero, that means that the null space of A (I'm bolding it to make sure it looks like a matrix) contains only one vector: the zero vector. Remember, if all of these weights have to be zero, then the only x that gets us to the zero vector is itself the zero vector. So the result that we're showing here is: if the column vectors of a matrix are linearly independent, then the null space of that matrix consists only of the zero vector. Or you can go the other way: if the null space of a matrix contains only the zero vector, then the columns of that matrix are linearly independent.