
Proof of formula for determining eigenvalues

Created by Sal Khan.


• Hmm - do more people see λI-A instead of A-λI? That's the way my professor is teaching it, and the way that David Lay's book lays it out.
• I see it that way in college too, and that's the way it appears in Steinbruch's, Boldrini's, and most of the other books I checked.
But it makes no difference; it's just algebraic manipulation. You can use whichever form you feel more confident with.
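The equivalence of the two conventions is easy to check numerically. A minimal Python sketch, assuming a hypothetical 2×2 matrix `A` and an arbitrary scalar `lam` (for even n, det(λI − A) = det(A − λI); for odd n they differ only by a sign, which doesn't change where they vanish):

```python
def det_2x2(M):
    """Determinant of a 2x2 matrix."""
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

A = [[2, 1],
     [1, 2]]      # hypothetical example matrix
lam = 5           # an arbitrary scalar, not necessarily an eigenvalue

# Build λI - A and A - λI entry by entry.
lam_I_minus_A = [[lam - A[0][0], -A[0][1]],
                 [-A[1][0], lam - A[1][1]]]
A_minus_lam_I = [[A[0][0] - lam, A[0][1]],
                 [A[1][0], A[1][1] - lam]]

print(det_2x2(lam_I_minus_A))  # 8
print(det_2x2(A_minus_lam_I))  # 8 -- same value, so same roots λ
```

Either determinant set to zero gives the same characteristic equation, hence the same eigenvalues.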
• Heh, I know this might sound silly, but what is the null space, and what does it mean to be non-trivial? And how can we assume this information? (Although, once the first question is answered, I may be able to figure the rest out myself.)
This may sound like a trolling question, but I'm serious. Where I've been going to school, the maths teachers seem to pay more attention to how maths is done (i.e. the formulas and ways of working out equations) than to what it means to be doing it, or how it can be applied. As for mathematical terms, we don't hear them often, as they are not seen as a big concern here; working out the maths is more important. If someone could help me out with this, it would be great.
• The null space of a matrix A is the set of all vectors v that satisfy the equation Av = 0 (it is mentioned in the video). Non-trivial in this context means a solution other than v = 0 to that equation; v = 0 always works, so it is called the trivial solution.
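To make the terms concrete, here is a small Python sketch with a hypothetical singular 2×2 matrix: v = 0 always solves Av = 0 (the trivial solution), and because the columns of this A are linearly dependent, a non-zero (non-trivial) solution exists too:

```python
# A hypothetical 2x2 singular matrix: its second column is twice its first,
# so the columns are linearly dependent and the null space is non-trivial.
A = [[1, 2],
     [2, 4]]

def matvec(M, v):
    """Multiply a 2x2 matrix by a 2-vector."""
    return [M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1]]

trivial = [0, 0]        # Av = 0 always holds for v = 0 (the trivial solution)
non_trivial = [2, -1]   # a non-zero vector that also satisfies Av = 0

print(matvec(A, trivial))      # [0, 0]
print(matvec(A, non_trivial))  # [0, 0]
```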
• So 2 questions.

1 - Do eigenvalues (and eigenvectors) only exist for an n×n matrix?

2 - Do eigenvalues (and eigenvectors) only exist for a matrix whose determinant is 0?
• 1. Yes, eigenvalues only exist for square matrices. For matrices of other dimensions you can solve similar problems using methods such as singular value decomposition (SVD).

2. No, you can find eigenvalues for any square matrix. The condition det(A − λI) = 0 applies only to the matrix A − λI, and only if you want eigenvectors other than the zero vector.
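For a 2×2 matrix, the condition det(A − λI) = 0 reduces to a quadratic in λ, so the eigenvalues can be computed directly with the quadratic formula. A sketch in Python, assuming a hypothetical symmetric matrix (so the eigenvalues are real):

```python
import math

def eigenvalues_2x2(A):
    """Eigenvalues of a 2x2 matrix via det(A - λI) = 0.

    For A = [[a, b], [c, d]] the characteristic polynomial is
        λ² - (a + d)λ + (ad - bc) = 0,
    i.e. λ² - trace(A)·λ + det(A) = 0.
    """
    (a, b), (c, d) = A
    trace = a + d
    det = a * d - b * c
    disc = trace * trace - 4 * det   # assumed non-negative (real eigenvalues)
    root = math.sqrt(disc)
    return sorted([(trace - root) / 2, (trace + root) / 2])

print(eigenvalues_2x2([[2, 1], [1, 2]]))  # [1.0, 3.0]
```

For larger matrices, in practice you would use a numerical routine such as `numpy.linalg.eig` rather than expanding the determinant by hand.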
• I don't understand why det(λI − A) must equal 0 in order for (λI − A)v = 0 to have a non-zero solution v.
• By definition of the null space of a matrix: if a vector v in the null space is not zero, the column vectors of the matrix are linearly dependent. The determinant of a matrix with linearly dependent columns is zero (the matrix is singular). Hope this helps.
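You can see this connection numerically: det(A − λI) vanishes exactly at the eigenvalues and nowhere else. A small Python check, assuming the hypothetical matrix A = [[2, 1], [1, 2]], whose eigenvalues are 1 and 3:

```python
def det_2x2(M):
    """Determinant of a 2x2 matrix."""
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def a_minus_lam_i(A, lam):
    """A - λI for a 2x2 matrix: subtract λ from the diagonal."""
    return [[A[0][0] - lam, A[0][1]],
            [A[1][0], A[1][1] - lam]]

A = [[2, 1],
     [1, 2]]  # eigenvalues are 1 and 3

for lam in [0, 1, 2, 3]:
    print(lam, det_2x2(a_minus_lam_i(A, lam)))
# 0 3
# 1 0   <- determinant is 0 exactly at an eigenvalue
# 2 -1
# 3 0   <- determinant is 0 exactly at an eigenvalue
```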
• Why did Sal only consider multiplying the identity matrix by v on the lambda side, but not multiplying v by A?
• Because he wants to put the vector v into matrix form using the identity matrix; A is already in that form.
• Could you add a video lesson on complex eigenvectors and complex eigenvalues?
• Just a silly question: are addition and subtraction between a matrix and a scalar undefined?
• Yes, addition and subtraction between a scalar and a matrix (or between matrices of different dimensions) are undefined. That is why, before adding, the scalar is multiplied by the identity matrix, so that at the time of the addition you are adding two matrices of the same size.
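A tiny sketch of that step, assuming a hypothetical 2×2 example: the scalar λ is first turned into the matrix λI (λ on the diagonal, 0 elsewhere), after which A − λI is an ordinary subtraction of two same-sized matrices:

```python
lam = 3
A = [[2, 1],
     [1, 2]]
I = [[1, 0],
     [0, 1]]

# Scalar-times-matrix IS defined, so λI is a legal 2x2 matrix...
lam_I = [[lam * I[i][j] for j in range(2)] for i in range(2)]

# ...and now A - λI is a subtraction of two matrices of the same size.
A_minus_lam_I = [[A[i][j] - lam_I[i][j] for j in range(2)] for i in range(2)]

print(lam_I)          # [[3, 0], [0, 3]]
print(A_minus_lam_I)  # [[-1, 1], [1, -1]]
```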
• I didn't quite understand why λI_n − A must have linearly dependent columns for us to solve (λI_n − A)v = 0 with a non-zero v... can somebody help me?
• `(A-λI)v = 0` For simplicity, let's replace `A-λI` with B.
`Bv = 0` Given this equation, we know that the set of all possible values of v is the null space of B. If v is an eigenvector, we also know that it needs to be non-zero. A non-zero eigenvector therefore means a non-trivial null space, since v would have to be 0 for a trivial null space. A non-trivial null space means linearly dependent column vectors.
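The chain of reasoning above can be checked on a hypothetical example: for A = [[2, 1], [1, 2]] and its eigenvalue λ = 3, the matrix B = A − λI has linearly dependent columns, and the non-zero eigenvector v = (1, 1) lands in its null space:

```python
# B = A - λI for A = [[2, 1], [1, 2]] and λ = 3.
# Its columns (-1, 1) and (1, -1) are linearly dependent.
B = [[-1, 1],
     [1, -1]]

v = [1, 1]  # candidate eigenvector (non-zero, i.e. non-trivial)

Bv = [B[0][0] * v[0] + B[0][1] * v[1],
      B[1][0] * v[0] + B[1][1] * v[1]]
print(Bv)  # [0, 0] -> v is a non-trivial element of the null space of B
```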