Let's say that I have some set
V that is a subspace of Rn. And just as a reminder,
what does it mean? That's just some set, or some
subset of Rn where if I take any two members of that subset--
so let's say I take the members a and b-- they're both members of my subspace. By the fact that this is a
subspace, we then know that the addition of these two
vectors, or a plus b, is also in my subspace. And this is our closure
under addition. And by the fact that it's a
subspace, we also know that if we multiply any member of our subspace by a scalar -- so those guys are members of our subspace -- if I pick one of them, let's say a, and I multiply a by some scalar, then this is also going to be a member of our subspace. And we sometimes call this
closure under scalar multiplication. And then a somewhat redundant
statement is that V, well it must contain the zero vector. And that's true of
all subspaces. V -- let me write it this
way -- the zero vector is a member of V. And it would be the zero vector
with n components here, because V is a subspace of Rn. And the reason I say that's redundant is that if any multiple of these vectors is also in V, I could just set the scalar equal to 0. So this statement kind of takes that one into account. But in a lot of textbooks, they
will always write, oh and the zero vector has to
be a member of V. Although, that's kind of redundant with the closure under scalar multiplication. Fair enough.
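Just to gather those conditions in one place -- this is only a restatement of what we just said, in symbols:

```latex
% V is a subspace of R^n means: for any a, b in V and any scalar c,
\[
\vec{a} + \vec{b} \in V \quad \text{(closure under addition)}, \qquad
c\,\vec{a} \in V \quad \text{(closure under scalar multiplication)}, \qquad
\vec{0} \in V
\]
```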
Now, let's say that I also have some transformation T. It is a mapping, a function, from Rn to Rm. What I want to understand in this video is: I have a subspace right here, V, and I want to understand whether
the transformation of the subspace -- and what
did we call that? We called that the image of our
subspace, or our subset, either way. The image of V under T. In the last video, just to kind of help you visualize it, we had some subset of Rn that looked like this. It was a triangle that looked something like that. This was actually in R2, a triangle that looked something like that. And we figured out its image under T. So we went from R2 to R2, and we had our transformation. And it ended up looking something like this, if I remember it properly -- a triangle that was rotated a bit clockwise like that and skewed. But the exact particulars
of that last video aren't what matter. What matters is that you are
able to visualize what an image under a transformation means. It means you take some subset of
R2, all of the vectors that define this triangle
right here. That's some subset of R2. You transform all of them, and
then you get some subset in your codomain. You could call this the image: it's the transformation of that triangle, or if we call that triangle s, it's equal to the transformation of s. Or you could say it's the image
of-- you can just call it the set s, but maybe it helps
you to visualize-- call it the image of this
triangle under T. Or maybe even a neater way of
thinking about it is, this triangle-- that skewed, rotated
triangle-- this one is the image of this right
triangle under T. I think that might make
a little bit of visual sense to you. And just as a bit of a reminder, in that last video these triangles weren't subspaces. You could take scalar multiples of some of the vectors that are members of this triangle, and you'd find that they're not going to be in that triangle. So this wasn't a subspace, it was just a subset of R2. Not all subsets are subspaces, but all subspaces are definitely subsets. Although something can be
a subset of itself. I don't want to wander
off too much. But this just helps
you visualize what we mean by an image. It means all of the vectors that are mapped to from the members of your subset.
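Written in set notation -- this is the same definition from the last video, with S standing for whatever subset we're transforming:

```latex
% the image of a subset S of R^n under the transformation T : R^n -> R^m
\[
T(S) = \{\, T(\vec{x}) \in \mathbb{R}^m \;:\; \vec{x} \in S \,\}
\]
```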
So I want to know whether the image of V under T is a subspace. So in order for it to be a subspace, let me find two members of the image. Well, clearly if I take the transformation of any members of V, I'm getting members of the image. Right? So I can write this. Clearly, the transformation of a and the transformation of b are both members of our image of V under T. These are both members
of that right there. So my question to you is what
is the transformation of a plus the transformation of b? And the way I have written this,
these are two arbitrary members of our image
of V under T. Or maybe I should call
it T of capital V. These are two arbitrary
members. So what is this equal to? Well, we know from our
properties, our definition of linear transformations, the sum
of the transformations of two vectors is equal to the
transformation of the sum of those vectors. Now, is the transformation
of a plus b, is this a member of TV? Is it a member of our image? Well, a plus b is a member of V,
and the image contains the transformation of all
of the members of V. So the image contains the
transformation of this guy. This guy, a plus b
is a member of V. So you're taking a transformation of a member of V, which, by definition, is in your image of V under T. So this is definitely true.
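That closure-under-addition argument, written out in one line using the additivity of a linear transformation:

```latex
% a, b are in V, and V is a subspace, so a + b is in V; therefore
\[
T(\vec{a}) + T(\vec{b}) = T(\vec{a} + \vec{b}) \in T(V)
\]
```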
Now, let's ask the next question. If I take a scalar multiple of some member of my image of V under T, or my T of capital V, right there -- if I take some scalar c times it, what is this equal to? By definition of a linear
transformation, this is the same thing as a transformation
of the scalar times the vector. Now is this going to
be a member of our image of V under T? Well we know that ca is
definitely in V, right? That's from the definition
of a subspace. This is definitely in V. And so, if this is in V, the transformation of it has to be in V's image under T. So this, too, is a member of the image of V under T.
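And the scalar multiplication argument looks just like it, using the scaling property of a linear transformation:

```latex
% a is in V and c is any scalar, and V is a subspace, so c*a is in V; therefore
\[
c\,T(\vec{a}) = T(c\,\vec{a}) \in T(V)
\]
% and with c = 0 this also puts the zero vector in T(V)
```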
And obviously, you can set this scalar equal to 0. The zero vector is a member of V, so -- if you just put a 0 here, you'll
get the zero vector. So the zero vector is definitely
-- I don't care what this is, if you multiply
it times 0, you are going to get the zero vector. So the zero vector is definitely also a member of T of V. So we come to the result that the image of V under T is a subspace, which is a useful result
which we will be able to use later on. But this, I guess, might
naturally lead to the question: everything we have been dealing with so far has been subsets, in the case of this triangle, or subspaces, in the case of V. But what if I were to take the image of Rn under T, right? This is the image
of Rn under T. Let's think about
what this means. This means: what do we get when we take the transformation of every member of Rn? What is the set of all of those vectors? Let me write this. This is equal to the set of the transformation of all of the x's, where each x is a member of Rn. So you take each of the members
of Rn and transform them, and you create
this new set. This is the image of Rn under T.
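In the same set notation as before, with all of Rn playing the role of the subset:

```latex
\[
T(\mathbb{R}^n) = \{\, T(\vec{x}) \;:\; \vec{x} \in \mathbb{R}^n \,\} \subseteq \mathbb{R}^m
\]
```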
Well, there's a couple of ways you can think of this. Remember when we defined -- let's see -- T is a mapping from Rn to Rm. We defined this, Rn, as the domain: all of the possible inputs
for our transformation. And we define this, Rm, as the codomain. And remember I told you that
the codomain is essentially part of the definition of
the function or of the transformation, and it's the
space that we map to. It's not necessarily only the things that we're actually mapping to. For example, the image of Rn under the transformation -- maybe it's all of Rm, or maybe it's just some subset of Rm. The way you can think about it,
and I touched on this in that first video -- and at least the linear algebra books I looked at didn't specify this -- is that you can kind of view this as the range of T. These are the actual members
of Rm that T maps to. If you take the image of Rn under T, you are actually finding -- well, let's say that Rm looks like that. Obviously it will go
in every direction. And let's say that when
you take-- let me draw Rn right here. And we know that T is a
mapping from Rn to Rm. But let's say when you take
every element of Rn and you map them into Rm, let's say
you get some subset of Rm, let's say you get something
that looks like this. So let me see if I can
draw this nicely. So you literally map every point
here, and it goes to one of these guys. Or one of these guys can be
represented as a mapping from one of these members
right here. So if you map all of them you
get this subset right here. This subset is T of Rn, the image of Rn under T. And in terminology that you don't normally see a lot in linear algebra, you can also kind of consider it the range of T. Now, this has a special name. This is called -- and I don't
want you to get confused -- this is called the image of T. Image of T -- this might be a little confusing. So this is sometimes written
as just im of T. Now maybe you're a little confused here; you're like, before, when we were talking about subsets, we would call this the image of our subset under T. And that is the correct
terminology when you're dealing with a subset. But when you take, all of
a sudden, the entire n-dimensional space, and you're finding that image, we call that the image of the actual
transformation. So we can also call this set
right here the image of T. And now what is the
image of T? Well, we know that we can write any of these -- and this is literally any T going from Rn to Rm -- we can write T of x, any linear transformation, like this: as being equal to some matrix, some m by n matrix, times a vector. And these vectors obviously are going to be members of Rn -- so it's that matrix times some member of Rn.
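Writing that statement down, where A is the m by n matrix that represents T:

```latex
\[
T(\vec{x}) = A\vec{x}, \qquad A \text{ is } m \times n, \quad \vec{x} \in \mathbb{R}^n
\]
```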
And what is this? So what is the image -- let me write it in a bunch of different ways -- what is the image of Rn under T? So we could write that as T --
let me write it this way. We could write that as T of Rn,
which is the same thing as the image of T. Notice we're not saying 'under' anything else, because now we're saying the image of the actual transformation, which we could also write as im of T. Well, what are these equal to? This is equal to the set of all
the transformations of x. Well all the transformations of
x are going to be Ax where x is a member of Rn. So x is going to be an n-tuple,
where each element has to be a real number. So what is this? So if we write A-- let
me write my matrix A. It's just a bunch of column vectors: a1, a2, and so on. It's going to have n of these, right? Because it has n columns. And so A times any x that's a member of Rn -- any x with entries x1, x2, all the way to xn -- we've seen this multiple,
multiple times. This is equal to x1 -- the scalar x1 -- times a1, plus x2 times a2, all the way to plus xn times an.
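That product, written out with a1 through an denoting the columns of A:

```latex
% requires amsmath for bmatrix
\[
A\vec{x} =
\begin{bmatrix} \vec{a}_1 & \vec{a}_2 & \cdots & \vec{a}_n \end{bmatrix}
\begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix}
= x_1\vec{a}_1 + x_2\vec{a}_2 + \cdots + x_n\vec{a}_n
\]
```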
And we're saying we want the set of all of these sums of these column vectors, where x can take on any vector in Rn, which means that the entries of x can take on any real scalar values. So what is this? The set of all of these is essentially all of the linear combinations of the columns of A, right? Because I can set these guys
to be equal to any value. So what is that equal to? That is equal to -- and we touched on this, or we actually talked about this when we introduced the idea -- the column space of A, which we just denote sometimes as C of A.
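So, chaining everything together, the result is:

```latex
\[
\operatorname{im}(T) = T(\mathbb{R}^n)
= \{\, A\vec{x} \;:\; \vec{x} \in \mathbb{R}^n \,\}
= \operatorname{span}(\vec{a}_1, \vec{a}_2, \ldots, \vec{a}_n)
= C(A)
\]
```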
So that's a pretty neat result. It's almost obvious -- I mean, I'm just playing with words a little bit -- but any linear transformation can be
represented as a matrix vector product. And so the image of any linear
transformation -- which means the subset of its codomain that you get when you map all of the elements of its domain into its codomain -- is equivalent to the column space of the matrix that your transformation can be represented as. And the column space, of course,
is the span of all the column vectors of your matrix. This is just all of the linear
combinations, or the span, of all of your column vectors,
which we did right here. Anyway, I hope you found that a little interesting, and you will be able to use these results in the future.