Proof of formula for determining eigenvalues

Video transcript

I've got a transformation T that's a mapping from R^n to R^n, and it can be represented by the matrix A, so the transformation of x is equal to Ax. We saw in the last video that it's interesting to find the vectors that only get scaled up or down by the transformation. So we're interested in the vectors where the transformation of some special vector v — which equals, of course, Av — is just that vector scaled by some factor: lambda times v. These are interesting because they make for good basis vectors: the transformation matrix in an alternate basis that includes them might be easier to compute with, and they can make for good coordinate systems. But they're interesting in general, and we call vectors v that satisfy this equation eigenvectors, and we call their scaling factors the eigenvalues associated with this transformation and that eigenvector. Hopefully, from that last video, we have a little bit of appreciation of why they're useful, but now, in this video, let's at least try to determine what some of them are. Basically, all we know so far is that if you show me an eigenvector, I can verify that it is one, and likewise for an eigenvalue, but I don't have a systematic way of solving for either of them. So let's see if we can come up with something.

In general, we're looking for solutions to the equation Av = λv: A times v is equal to lambda times the vector v. Now, one solution might immediately pop out at you, and that's v equal to the zero vector. That definitely is a solution, although it's not normally considered an eigenvector, because, first, it's not a useful basis vector: it doesn't add anything to a basis; it doesn't enlarge the set of vectors you can span when you throw it in there. And second, it's not clear what eigenvalue would be associated with it, because if v is equal to the zero vector, any eigenvalue will work. So normally, when we're looking for eigenvectors, we start with the assumption that we're looking for nonzero vectors — vectors that are not equal to the zero vector.

Given that, let's play around with this equation a little bit and see if we can at least come up with the eigenvalues, maybe in this video. If we subtract Av from both sides, we get that the zero vector is equal to λv − Av. Now we can rewrite v: v is just the same thing as the identity matrix times v, right? v is a member of R^n, so if you multiply it by the n-by-n identity matrix, you just get v again. So if I rewrite v this way, at least in this part of the expression — and let me swap the sides — I get lambda times the n-by-n identity matrix, times v, minus A times v, equal to the zero vector: λI_n v − Av = 0. Now I have one matrix times v minus another matrix times v, and matrix-vector products have the distributive property, so this is equivalent to the matrix λI_n − A times the vector v, and that's going to be equal to the zero vector: (λI_n − A)v = 0. This is just some matrix right here, and the whole reason I made the substitution is so that I could write this as a matrix-vector product instead of a scalar-vector product. That way I was able to essentially factor out the v and write this whole equation as some matrix-vector product equal to the zero vector.
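Written out as equations, the steps so far look like this (this just restates the algebra from the transcript, nothing more):

```latex
\begin{align*}
A\vec{v} &= \lambda \vec{v} && \text{(eigenvector equation, } \vec{v} \neq \vec{0}\text{)} \\
\vec{0} &= \lambda \vec{v} - A\vec{v} && \text{(subtract } A\vec{v} \text{ from both sides)} \\
\vec{0} &= \lambda I_n \vec{v} - A\vec{v} && \text{(rewrite } \vec{v} \text{ as } I_n \vec{v}\text{)} \\
\vec{0} &= (\lambda I_n - A)\vec{v} && \text{(factor out } \vec{v}\text{)}
\end{align*}
```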
Now, if we assume that this is the case — and remember, we're assuming that v does not equal the zero vector — what does this mean? It means that v is a member of the null space of this matrix right here. Let me write this down: v is a member of the null space of λI_n − A. I know that might look a little convoluted to you right now, so just imagine that this is some matrix B; it might make it simpler. This is just some matrix here — that's B. Let's make that substitution; then this equation just becomes Bv = 0. Now, the null space of B is all of the vectors x in R^n such that Bx is equal to the zero vector. Well, v is clearly one of those guys, because Bv = 0 — we're assuming v solves the original equation, which got us all the way to the conclusion that it must solve this one — and v is not equal to zero. So v is a member of the null space, and it's a nontrivial member of the null space. We already said that the zero vector is always going to be a member of the null space and would make this true, but we're assuming v is nonzero; we're only interested in nonzero eigenvectors. And that means this guy's null space has to be nontrivial: the null space of λI_n − A is nontrivial; the zero vector is not its only member.

You might remember from before that, in general — I've already used A and B, so let's say I have some matrix D — D's columns are linearly independent if and only if the null space of D contains only the zero vector. So if we have some matrix whose null space does not contain only the zero vector, then it has linearly dependent columns. I just wrote that down to show you what we do know, and the fact that this matrix doesn't have a trivial null space tells us that we're dealing with linearly dependent columns. So λI_n − A — it looks all fancy, but it's just a matrix — must have linearly dependent columns. Another way to say that is: if you have linearly dependent columns, you're not invertible, which also means that your determinant must be equal to zero. All of these statements are equivalent: if your determinant is equal to zero, you're not going to be invertible, you're going to have linearly dependent columns, and you're going to have nontrivial members in your null space. So if there are some nonzero vectors v that satisfy this equation, then this matrix right here must have a determinant of 0. And it goes the other way: if there are some lambdas that make this matrix have a determinant of 0, then those lambdas satisfy this equation for nonzero vectors v.
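As a quick numerical sanity check of this chain of equivalences — using NumPy and a made-up 2-by-2 matrix, not one from the video — you can see that when lambda is an eigenvalue, the matrix λI_n − A really is singular with a nontrivial null space:

```python
import numpy as np

# A made-up 2x2 example matrix (its eigenvalues happen to be 1 and 3).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

lam = 3.0                  # an eigenvalue of this particular A
B = lam * np.eye(2) - A    # the matrix (lambda * I_n - A) from the transcript

# If lam is an eigenvalue, B is not invertible: its determinant is (numerically) 0.
print(np.linalg.det(B))    # ~0.0

# ...and B has a nontrivial null space: B v = 0 for a nonzero v.
v = np.array([1.0, 1.0])   # a nonzero vector in the null space of B
print(B @ v)               # [0. 0.]
print(A @ v)               # [3. 3.] = lam * v, so v is an eigenvector
```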
So let me write this as an if and only if: Av = λv for some nonzero v if and only if the determinant of λI_n − A is equal to the zero vector — no, not the zero vector, sorry; the determinant is just a scalar — it's just equal to 0. And that's our big takeaway. Now you're saying, how is that useful for me, Sal? We did all of this manipulation, and I talked a little bit about null spaces, but the big takeaway is that in order for this to be true for some nonzero vector v, lambda has to be some value such that if I take the determinant of lambda times the identity matrix minus A, it's got to be equal to zero. And the reason why this is useful is that you can actually set this equation up for your matrices and then solve for your lambdas, and we're going to do that in the next video.
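For a preview of what "set this equation up and solve for your lambdas" looks like in practice, here is a minimal sketch using SymPy, again with the same made-up 2-by-2 matrix (this is an illustration, not the worked example from the next video):

```python
import sympy as sp

lam = sp.symbols('lam')

# Same made-up example matrix as above.
A = sp.Matrix([[2, 1],
               [1, 2]])

# Set up det(lambda * I_n - A) = 0 and solve for lambda.
char_poly = (lam * sp.eye(2) - A).det()    # expands to (lam - 2)**2 - 1
eigenvalues = sp.solve(sp.Eq(char_poly, 0), lam)
print(eigenvalues)                         # [1, 3]
```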