# Proof: Any subspace basis has same number of elements

## Video transcript

Let's say I've got some set A of the vectors a1, a2, all the way to an. And I know for a fact that it's a basis for the subspace V. What I want to show you in this video is that if this guy has n elements right here, then any set that spans V has to have at least n elements, or n members, or cardinality n. There's all different ways of saying you have n vectors in this set. I'm saying that every set that spans V must have at least n elements, if some basis set for V has n elements. Let's see if we can kind of run with a set that has fewer than n elements and see if we reach any contradictions. So let's say that I have some set B here, and it's equal to the vectors b1, b2, all the way to bm. And m is less than n, so I have some set of vectors here that has fewer elements than my set A. So you come to me one day and
say, look I found you this set of vectors right here. And not only does it have
fewer elements than A, but it spans V. I look at you very suspiciously
because I always thought that this green
statement was true. So we start a little bit of
a thought experiment. And I say OK, you claim
that your set spans V, so let's do something. Let me define a new set. Let me call this new set B1
prime, and you'll see why I'm doing this kind of
strange notation. It's essentially going to be the set B plus my vector a1. So it's a1, and then I have all of my elements of B. So b1, b2, all
the way to bm. Now I think you and I could both
agree that this set is linearly dependent. How do I know that? Linear dependence means that at
least one of the elements of the set can be represented
as a linear combination of the others. Well, we know that a1 is one of the basis vectors for V, by this definition of a basis. But all the basis vectors
are members of V. If this set is a basis for V,
then this means that this set spans V, or that every member
of V can be represented as a linear combination
of these guys. Or, put another way, every
linear combination of these guys is in V. And one of the linear
combinations of these guys is you just set the coefficient
on a1 to be 1, and the coefficients on everyone
else to be 0. So obviously a1 is
also in the set. So if a1 is in V, and all
of these guys span V, by definition if these guys span
V some linear combination of these guys can be used to
construct any member of V. So you can take some linear
combination of these guys to construct a1. So you could say a1 is equal to
d1, where the d's are the constants, d1 b1 plus d2 b2, all
the way to dm bm, and at least one of these have
to be non-zero. We know that a is a
non-zero vector. If it was a 0 vector, this
couldn't be a basis because it wouldn't be linearly
independent, because you can always represent a 0 vector
as really just 0 times any other vector. So this won't be a 0 vector. So at least one of these is non-zero. So let's just say, for the sake
of argument, that dj -- so the coefficient on
bj is non-zero. So dj does not equal 0. So we could actually solve
for that term. So over here someplace you have
the term dj bj, plus a bunch of other stuff. We can solve for this term if we
subtract it from both sides of the equation, and then divide
both sides by minus dj, and put this minus a1 on the
other side, what do we get? I know that was a lot of
operations, but that's just straight up algebra. I think you can see that we could rewrite this right here. We can solve for bj. That should be equal to minus 1 over its coefficient, dj, times the quantity minus a1, plus d1 b1, plus all the rest of the terms -- you're going to have a little bit of a gap here where bj used to sit; I'll just draw it like that, it's very unconventional notation -- all the way to dm bm. I'm doing all of this to
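To keep the board work straight, here is the algebra of this step written out in symbols (the same d coefficients as in the transcript):

```latex
a_1 = d_1 b_1 + d_2 b_2 + \cdots + d_j b_j + \cdots + d_m b_m, \qquad d_j \neq 0
\quad\Longrightarrow\quad
b_j = -\frac{1}{d_j}\left(-a_1 + d_1 b_1 + \cdots + d_{j-1} b_{j-1} + d_{j+1} b_{j+1} + \cdots + d_m b_m\right)
```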
show you that, by definition, you can write a1 as a
linear combination of these other guys. But you can just rearrange
things. You can rearrange it so that you
can write one of the other guys as a linear combination
of the rest of the other guys and a1. This guy's now redundant. I don't need this guy any longer
to continue to span V. Clearly this set
still spans V. I added an extra vector here. But I can remove this guy right
here from my set B1 prime and still span V. And how do I know that? Because I can still construct him. By removing him, I don't
lose anything. Because if I needed this vector
to create some other vector, I can construct him with
a linear combination of the rest of the b's,
plus my a1. So let's get rid of him,
and call that set B1. And actually, just for the sake
of notation, let me just change his name. This is little unconventional. You won't see it done like
this in any textbook. But I think it's a little bit
easier, instead of having to keep talking about these little
guys that are embedded someplace in the middle
of the stream. I mean these names, b1, b2, bm, they're arbitrary names. Let me swap the labels. Let me just say that bj is equal
to b1, and that b1 is equal to bj. I'm just swapping their names. I took that guy, I
renamed him b1. I renamed b1, bj, so that
I could swap them. So I'm essentially just going
to remove b1 one from the vector just to make my
notation easier. You could just keep saying I'll
remove this bj from the middle, but it becomes
very confusing then. So let me call my new set after
removing the bj that I've renamed as b1, let me just
call that straight up B1. So my set B1 is equal
to a1, and then remember I removed the bj and I renamed
that as b1, and then I renamed b1 as bj. So now my set looks like
this -- let me go to the other color. b2 -- and for all we know
bj might have been b1, we don't know. There are probably multiple of
these that are non-zero, so we could pick any of those
to be our bj. But anyway, we took our bj, renamed it b1, and removed it. So now our set looks like this:
b3 all the way to bm. This set still spans V. And we know that because the
guy we removed can be constructed with any linear
combination of these guys. So we haven't lost our ability
to construct all of the vectors in V. Now let me create another set. We'll do a new color. Let's say I have the set B2 prime. And what I'm going to do here
is now I'm going to take another element from
our basis of V. I'll take the second element;
I'll take a2. I'll take a2 and throw
it on this guy. So now we have the set B2 prime
is equal to -- I'm going to add a2 to this guy. So you have a1, a2, and then you
have all the rest of these guys, b2, b3, all the way to bm. Of course this still spans V,
I just added something here. But this is definitely
linearly dependent. Remember I didn't say in the beginning whether this was linearly dependent or not. It may or may not be. But when you add this other vector that's in V, you definitely know that you're linearly dependent, because these guys can construct that guy. Similarly, we know that this set B1 spans V. So when we add this new element
here, we know that it can be written as a linear
combination of the other one. So we know that this right here
is linearly dependent. And we could say that a2 is equal to some constant c1 times a1, plus c2 b2, plus c3 b3, all the way to cm bm. Now what I'm going to claim is
that at least one of these coefficients is non-zero. So at least one of the ci's
does not equal 0. And I'll make the further claim that at least one of the non-zero ones is outside of this first term -- that at least one of the coefficients on these b terms has to be non-zero. And the way you can think about it
is, what if all of these guys were 0? If all of these guys were 0,
then that means that a2 is a linear combination of a1. All of these guys would cancel
out, and you'd have a2 is equal to some non-zero
constant times a1. We know that's not the case
because these two guys come from the same linearly
independent set. They both come from that
spanning basis. The fact that they are a basis
-- the word spanning basis, I shouldn't say it like that,
because it's redundant. A basis is a spanning set that
is linearly independent. If they're linearly independent
we know that a2 cannot be represented as some
linear combination of the rest of these guys. So we know that one of the
coefficients on the B terms has to be non-zero. And let's just say that you're
going to have a cj bj, this is a different one than
we had before. And we know that at least one of them has to be non-zero, because if all of these guys were zero, then you wouldn't be able to say that this vector and that vector are linearly independent,
because they would be scalar multiples
of each other. So we're going to do
the same exercise. This guy right here, cj bj,
obviously this coefficient is non-zero so we can
solve for our bj. Once again we can say that bj is equal to minus 1 over cj, times minus a2 plus c1 a1, all the way to cm bm. So we have some bj here that can
be represented as a linear combination of the rest of the
people, including our new a2. And so, just like we did before,
let's remove him. Let's take him out of the set. And before I take him
out of the set, I'm going to rename him. Just solely for the purpose of
notational simplicity, I'm just going to rename our bj as b2, and our b2 as bj. So I'm just rearranging
the names. And I'm going to
remove our b2. Or I'm going to remove
what I now call b2. It was whatever was out here
that could be represented as a linear combination of
everything else, including our new a2. When I remove that term right here, I get my new set B2. And now it's equal to a1, a2, and now I have the leftovers of my b's. So I have b3, b4, all
the way to bm. Notice I still have exactly
m elements, and this still spans V. It spans V because the element
that I took out of it can be represented as a linear
combination of these guys. So if I ever want to construct anything that needed that vector, I can construct it with some combination of these guys. So it wasn't necessary. It still spans V. So this process I'm doing, I
can just keep repeating it. I can add an a3. I can define B3 prime. I can just add a3 to this set right here: a1, a2, a3. And then I have my b3, b4,
all the way to bm. And I'll say this is linearly dependent, because everything but this new guy spans V. So obviously you can construct
this guy with a linear combination of the
rest of them. So you could say a3 is equal to some constant times a1, plus some constant times a2, plus c3 b3, all the way to cm bm. And we know that at least one
of the coefficients on the B terms has to be non-zero. Because if all of those were
0, then you would be saying that a3 could be a linear
combination of the a terms. And we know that a3 can't be represented as a linear combination of the a terms, because all these a terms come from a linearly independent
set. So you do the same operation. Let's say that cj is non-zero. Then you could solve
for that bj. And then I do that little
renaming thing I do, where I rename the bj b3, and rename b3
bj, and then I remove b3. And I get the set B3 is
equal to a1, a2, a3. And I have b4 all
the way to bm. And this still spans V. And I keep doing it. So what's going to happen
eventually? What's going to happen if I keep
doing this process over and over and over again? Eventually I'll essentially replace all of the b's -- all m of them -- so eventually I'll have a set that looks like this. I'll have a set Bm,
where I will have replaced each of these guys with an a. So I have a1, a2, a3,
all the way to am. You can always do this by
definition if you started with that initial set B that
is a spanning set. And once you do this process
you'll get the same result that this also spans V. Now let me write this. This is the result we got by
starting off with a spanning set B that has m elements,
where we said that m is less than n. So we always have enough a
elements to do this, because we have more a elements
than there were b elements to begin with. And we get the result that this spans V. But we already said that the
set A which is equal to a1, a2, all the way to am, and then
am plus 1, I don't know how many more terms there are
between m and n, but then you go all the way to an. Remember we said that
n is greater than m. Or when we defined B, we said
that m is less than n. Same thing, that this
was a smaller set. Now we're saying that this spans
V, but at the same time we said this was a basis. This is just our starting
fact, that this is a basis for V. Basis means two things: it means
it spans V and it means it's linearly independent. Now we just got this result by
assuming that we had some set B that's smaller than this
set here that spans V. We were able to construct this
by saying that a1 through am also spans V. The result we got is
that this spans V. But if this subset of A spans V, then A becomes linearly dependent. Because if this subset spans V, that means that an can be represented as some linear combination of these guys. So that implies that you're
linearly dependent, which is a contradiction with our original
statement that set A is a basis for V, because
that means it's linearly independent. If you're able to do this -- if there was some smaller spanning set -- you get the result that A has to be linearly dependent, even
though we said it was linearly independent. So we now know, we get our
contradiction, we say that there cannot be a spanning set B that has fewer elements than A. And this is a pretty neat
outcome, because now, if I come up to you and say I found some set X that spans the subspace V, and some basis for V has five elements, then you know that X has at least five elements. No set that spans the subspace V can have fewer than five elements. Even better, if I told you that
X is a basis for V, and I told you it has five elements,
and Y is a basis for V. You know that Y also has to have
exactly five elements. How do I know that? Well, if Y is a basis, then that
means that it spans V. And we know it can't
have anything less than five elements. We just proved that. So going one way, we know that Y has to have at least five elements. But on the other hand, we know if
Y is a basis for V and X is a basis, X also spans V. So we know that Y's number of elements has to be greater than or equal to X's, because any spanning set has to have at least as many elements as a basis set -- here, the basis X. And then since X is a spanning set, X's number of elements has to be greater than or equal to Y's, because Y is a basis. But if X's number of elements is both greater than or equal to Y's and less than or equal to Y's, we know that the number of elements that X has --
X's elements, or the cardinality or the number of
elements in it --is equal to Y's elements. And so now that we know that any
basis for a vector space-- Let me just go back
to our set A. A is equal to a1 a2,
all the way to an. We can now say that all bases for some subspace V have the same number of elements. And so we can define a
new term called the dimension of V. Sometimes it's written just as dim of V. It's equal to the number of elements, sometimes called the cardinality, of any
in this video to show that any basis of V all has the same
number of elements, so this is well-defined. You can't have one basis that
has five elements and one that has six. By definition they would either
both have to have five, or they both would
have to have six. And so we can define
our dimensionality.
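None of this is in the video, but the result is easy to check numerically. Here is a minimal NumPy sketch (the particular subspace, matrices, and names are my own illustrative choices): the dimension of a subspace comes out as the rank of a matrix whose columns are any basis, a second basis for the same subspace necessarily has the same count, and a smaller set can never span it.

```python
import numpy as np

# Basis A for a subspace V of R^4: n = 3 linearly independent columns.
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [0.0, 0.0, 0.0]])

# A second basis Y for the *same* V: invertible combinations of A's columns.
M = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])  # invertible 3x3, so Y's columns also span V
Y = A @ M

# dim V = number of elements in any basis = rank of the basis matrix.
assert np.linalg.matrix_rank(A) == 3
assert np.linalg.matrix_rank(Y) == 3
# Putting both bases side by side doesn't raise the rank: same subspace.
assert np.linalg.matrix_rank(np.hstack([A, Y])) == 3

# A set with fewer than 3 vectors can never span V:
B = A[:, :2]  # only m = 2 vectors
assert np.linalg.matrix_rank(B) == 2  # spans at most a 2-dimensional subspace
print("dim V =", np.linalg.matrix_rank(A))  # prints: dim V = 3
```

The key step mirrors the proof: Y was built by "exchanging" basis vectors via an invertible change of coordinates M, which is exactly why it must have the same number of columns as A.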