
# Introduction to eigenvalues and eigenvectors

## Video transcript

For any transformation that maps from R^n to R^n, we've found it interesting (we've done it implicitly so far) to look for the vectors that essentially just get scaled up by the transformation: the vectors for which the transformation of the vector equals some scaled-up version of the vector, T(v) = λv.

If this doesn't look familiar, I can jog your memory a little bit. When we were looking for basis vectors for a transformation from R^2 to R^2, let me draw R^2 right here. Let's say I had the vector v1 equal to (1, 2), and we had the line spanned by that vector. We did this problem several videos ago: call that line L, and T was the transformation from R^2 to R^2 that flipped vectors across L. So if I had some random vector, let's say that's the vector x, then the transformation of x looked like x flipped across that line.

If you remember that video, we were looking for a change of basis that would let us figure out the matrix for the transformation in at least an alternate basis, and then we could figure out the matrix for the transformation in the standard basis. The basis we picked was made of vectors that didn't get changed much by the transformation, or ones that only got scaled by it. For example, when I took the transformation of v1, it just equaled v1; or we could say the transformation of v1 equaled 1 times v1. So if you follow the little format I set up here, lambda in this case would be 1, and the vector is v1: the transformation just scaled v1 by 1.

In that same problem we had another vector that we also looked at: the vector v2, which was (2, -1). If you take the transformation of it, since it was orthogonal to the line, it just got flipped over. That was a pretty interesting vector for us as well, because the transformation of v2 in this situation is equal to minus v2; or you could say the transformation of v2 is equal to -1 times v2. These were interesting vectors, because when we defined a new basis with these guys as the basis vectors, it was very easy to figure out our transformation matrix, and that basis was very easy to compute with. We'll explore that a little bit more in the future.

There were also the cases where we had a plane spanned by some vectors, and another vector popping out of the plane, and we were transforming things by taking the mirror image across the plane. In that transformation, the vectors in the plane (the red ones) didn't change at all, and the vector popping out just got flipped over, so those made for good basis vectors too.

So in general, we're always interested in the vectors that just get scaled up by a transformation. It's not going to be all vectors, right? This vector x that I drew here doesn't just get scaled up; its direction actually gets changed. The vectors that do just get scaled might switch direction, say from this direction to that direction, when the transformation of x is a scaled-up, flipped version of x, but the line that they span will not change.

And that's what we're going to concern ourselves with, because these vectors have special names, and I want to make the names very clear, because they're useful. It's not just some mathematical game we're playing, although sometimes we do fall into that trap. They're useful for defining bases, because in those bases it's easier to find transformation matrices; they're more natural coordinate systems; and oftentimes the transformation matrices in those bases are easier to compute with.

So here are the special names: any vector that satisfies T(v) = λv is called an eigenvector for the transformation T, and the lambda, the multiple that the vector gets scaled by, is the eigenvalue associated with that eigenvector. So in the example I just gave, where the transformation flips vectors across this line, v1, the vector (1, 2), is an eigenvector of our transformation, and its corresponding eigenvalue is 1. This guy, the vector (2, -1), is also an eigenvector. It's a very fancy word, but all it means is a vector that's just scaled up by a transformation; it doesn't get changed in any more meaningful way than by a scaling factor. Its corresponding eigenvalue is -1.

If this transformation (I forget what its transformation matrix is; we actually figured it out a while ago) can be represented as a matrix-vector product, and it should be, since it's a linear transformation, then any v that satisfies T(v) = λv, which would also equal A times v, is also called an eigenvector of A, because A is really just the matrix representation of the transformation. So in this case v would be an eigenvector of A, and λ would be the eigenvalue associated with that eigenvector.

So if you give me a matrix that represents some linear transformation, you can also figure these things out. In the next video we're actually going to come up with a way to figure them out, but what I want you to appreciate in this video is what "vectors that don't get changed much" really means: they literally just get scaled up, or maybe reversed, but their direction, the line they span, fundamentally doesn't change. And one of the reasons they're interesting for us is that they make for good basis vectors: basis vectors whose transformation matrices are maybe computationally simpler, or ones that make for better coordinate systems.
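To make the flip example concrete, here is a small Python sketch that checks both claims numerically. The video never writes out the reflection matrix, so it is reconstructed here from the standard formula for reflection across the line spanned by u, namely R = 2uu^T/(u . u) - I; the helper `apply` is just illustrative 2x2 matrix-vector multiplication.

```python
from fractions import Fraction as F

# Reflection across the line L spanned by u = (1, 2).
# The matrix isn't written out in the video; this uses the standard
# formula R = 2*u*u^T / (u . u) - I, so treat it as a reconstruction.
u = (F(1), F(2))
dot = u[0] * u[0] + u[1] * u[1]  # u . u = 5
R = [[2 * u[i] * u[j] / dot - (1 if i == j else 0) for j in range(2)]
     for i in range(2)]          # [[-3/5, 4/5], [4/5, 3/5]]

def apply(M, v):
    """2x2 matrix-vector product."""
    return tuple(M[i][0] * v[0] + M[i][1] * v[1] for i in range(2))

v1 = (F(1), F(2))    # on the line:       T(v1) =  1 * v1
v2 = (F(2), F(-1))   # orthogonal to it:  T(v2) = -1 * v2
print(apply(R, v1))  # (Fraction(1, 1), Fraction(2, 1))   -> eigenvalue  1
print(apply(R, v2))  # (Fraction(-2, 1), Fraction(1, 1))  -> eigenvalue -1
```

So v1 comes back unchanged and v2 comes back negated, exactly the λ = 1 and λ = -1 behavior the transcript describes.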
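The transcript defers how to actually find eigenvalues to the next video. As a preview, for a 2x2 matrix they are the roots of the characteristic polynomial det(A - λI) = λ^2 - trace(A)λ + det(A). The sketch below applies that to the reflection across the line spanned by (1, 2); its entries, [[-3/5, 4/5], [4/5, 3/5]], come from the standard reflection formula rather than from the video itself.

```python
import math

# Eigenvalues of a 2x2 matrix A as roots of its characteristic
# polynomial: det(A - lam*I) = lam^2 - trace(A)*lam + det(A) = 0.
A = [[-3 / 5, 4 / 5],
     [4 / 5, 3 / 5]]  # reflection across the line spanned by (1, 2)

tr = A[0][0] + A[1][1]                       # trace(A) = 0
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]  # det(A)   = -1
disc = math.sqrt(tr * tr - 4 * det)          # discriminant of the quadratic
eigenvalues = sorted([(tr - disc) / 2, (tr + disc) / 2])
print(eigenvalues)  # approximately [-1.0, 1.0]
```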
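The transcript also claims the eigenvector basis makes the transformation easy to compute with. Concretely: if B has v1 and v2 as its columns, the change of basis D = B^-1 A B yields a diagonal matrix with the eigenvalues on the diagonal. A sketch of that claim, again using the reconstructed reflection matrix rather than one quoted from the video:

```python
from fractions import Fraction as F

# A is the reflection across the line spanned by (1, 2), reconstructed
# from the standard reflection formula (not quoted from the video).
A = [[F(-3, 5), F(4, 5)],
     [F(4, 5), F(3, 5)]]
B = [[F(1), F(2)],
     [F(2), F(-1)]]  # columns are the eigenvectors v1=(1,2), v2=(2,-1)

def matmul(X, Y):
    """2x2 matrix product."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

det_b = B[0][0] * B[1][1] - B[0][1] * B[1][0]  # det(B) = -5
b_inv = [[B[1][1] / det_b, -B[0][1] / det_b],
         [-B[1][0] / det_b, B[0][0] / det_b]]  # 2x2 inverse formula
D = matmul(b_inv, matmul(A, B))                # D = B^-1 * A * B
print(D)  # equals [[1, 0], [0, -1]]: the eigenvalues on the diagonal
```

In the {v1, v2} basis the flip is just "keep the first coordinate, negate the second", which is why these bases are pleasant to work in.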