
Null space 3: Relation to linear independence

Understanding how the null space of a matrix relates to the linear independence of its column vectors. Created by Sal Khan.


  • leidenschaft1:
    At , we have an nxn matrix, whereas we started with a matrix A that is mxn. Khan mentions that this has to be nxn, but I don't see why A has turned into an nxn matrix, or why it has to be nxn.
    (30 votes)
    • Mike Bergen:
      I think Sal got it wrong here. There could also be the case where m > n. But this would require rref(A) to have all rows below the nth row be zero. In that case the row vectors would be linearly dependent, but the column vectors could still be linearly independent (their span would be a subspace of R^m), and N(A) = {0}.

      Response to other answers: a square matrix is the requirement for A BASIS. Linear independence does not require a square matrix. So in an RREF matrix you can add rows of zeros and the columns remain linearly independent. In the nxn case the linearly independent columns span R^n (i.e., a basis), and in the mxn case (m > n) the linearly independent columns span a SUBSPACE of R^m (not a basis of R^m). In both cases the null space has 0 as the unique solution.
      (41 votes)
  • Goutham Gopalakrishna:
    Can someone explain to me the physical meaning of the null space? I understand it mathematically but don't get it in a practical sense. Thanks.
    (12 votes)
    • Gobot:
      A 'physical meaning' would come from the meaning of the numbers in the matrix. Look up using matrices to solve Kirchhoff's Current Law for one example where the null space is used in an applied problem. Otherwise, if you just want some intuition: the null space of a matrix A is the set of vectors that are perpendicular to all the rows of A.
      (34 votes)
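To make Gobot's intuition concrete, here is a minimal sketch in Python (assuming NumPy is available; the matrix and null-space vector below are hand-picked examples, not from the video) showing that a null-space vector is perpendicular to every row of A:

```python
import numpy as np

# A 2x3 matrix whose middle column is the average of the outer two,
# so its null space is nontrivial.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

# x = (1, -2, 1) satisfies A @ x = 0: col1 - 2*col2 + col3 = 0.
x = np.array([1.0, -2.0, 1.0])

print(A @ x)               # the zero vector
for row in A:
    print(np.dot(row, x))  # each row is perpendicular to x: dot product 0
```

Running this prints zeros throughout: x lies in N(A) precisely because it is orthogonal to every row.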
  • Rossana Isola:
    All that means that rref(A) = I, the identity matrix, right?
    (7 votes)
  • Rodrigo Alves:
    Sal explains that the only way for the matrix's column vectors to all be linearly independent is if none of them is (i.e., may be represented as) a combination of the others. In that case the only solution is 0.

    Then he says that for A.x = 0 to be true, x must be the zero vector. I guess I can conclude by now that the only way to satisfy linear independence is if x = 0.

    Is that wrong? Have I made any mistake in this thought?
    (9 votes)
    • Jeremy:
      Rodrigo,

      You haven't made a mistake; this is correct.

      Basically, Ax is the sum of each column of A times its respective term in x (the ith column of A times the ith term of x, added over all i). If the columns of A are a linearly independent set, then the only way to multiply them all by some coefficients, add them all together, and STILL get zero is if all of the coefficients are zero. In this case, the terms of x act like the coefficients of the columns of A. For the whole thing to equal the zero vector, all of the x terms must be 0. In other words, the null space of A is ONLY the zero vector.
      (10 votes)
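Jeremy's point, that Ax is a linear combination of the columns of A weighted by the entries of x, can be checked directly. A minimal sketch (assuming NumPy; the matrix and vector are arbitrary examples):

```python
import numpy as np

# A hypothetical 3x2 matrix and a weight vector x.
A = np.array([[1.0, 4.0],
              [2.0, 5.0],
              [3.0, 6.0]])
x = np.array([2.0, -1.0])

lhs = A @ x                             # matrix-vector product
rhs = x[0] * A[:, 0] + x[1] * A[:, 1]   # x1*col1 + x2*col2

print(lhs, rhs)  # identical: Ax IS the column combination
```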
  • Zulfidin Hojaev:
    What does null space mean, physically and mathematically?
    (3 votes)
    • Kyler Kathan:
      If A x⃑ = 0⃑, then x⃑ is in N(A), and vice versa.

      The null space of a matrix is a subspace: N(A) is the set of all such x⃑. Every single vector within that subspace becomes the zero vector when transformed by A (which is what the original equation means).

      If N(A) ≠ {0⃑}, then A isn't invertible.

      Hope this gives you some understanding.
      (8 votes)
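Kyler's last claim, that a nontrivial null space means A isn't invertible, can be illustrated numerically. A small sketch (assuming NumPy; the singular matrix below is a hand-picked example):

```python
import numpy as np

# Columns are dependent (col2 = 2*col1), so N(A) is nontrivial.
A = np.array([[1.0, 2.0],
              [3.0, 6.0]])
x = np.array([2.0, -1.0])  # a nonzero null-space vector: 2*col1 - col2 = 0

print(A @ x)             # zero vector, even though x != 0
print(np.linalg.det(A))  # determinant 0 -> A is not invertible
```

A nonzero x with Ax = 0 means A collapses distinct inputs to the same output, so no inverse map can exist; the zero determinant confirms it.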
  • Martin McEnroe:
    Salman must have converted a matrix into a vector of vectors. This is confusing, since I had always thought of matrices as composed of real numbers. In computer languages such as Python there is a distinction between an "array" and a "list". A list can be a list of lists, but an array cannot be an array of arrays, at least as far as I know. Can someone comment on what Salman did here?
    (3 votes)
  • LeRoy Sandoval Jr.:
    So what is the actual purpose of null space? What does it signify?
    (3 votes)
    • Tre Giles:
      The null space is the set of all vectors v that, when multiplied by a matrix A in the form Av, give the zero vector. This is useful because it tells you every single vector that, when multiplied by A, will result in 0. That's why it's called the "null" space: the space of vectors that A "nullifies" or "zeroes out". I've seen it used in low-level graphics programming (OpenGL) and in games where 3D vector manipulation is baked in (like Second Life). Also note the nice fact that the null space is the set of all vectors that are orthogonal to the matrix's row vectors (dot product = 0).
      (5 votes)
  • Luke:
    At , Sal starts talking about linear independence.
    What if the columns of an n by n matrix weren't linearly independent?
    (1 vote)
    • newbarker:
      Then the zero vector could be obtained by choosing an x not equal to zero in the equation Ax = 0. For instance, if in R^3 you had a 3x3 matrix A that could be multiplied by a vector x (where x isn't [0,0,0]) such that the product was the zero vector [0,0,0], then the null space of A isn't trivial*, which implies that the columns of A (i.e., the three R^3 vectors) are linearly dependent. This means that one of the vectors could be written as a combination of the other two.

      In essence, if the null space is JUST the zero vector, the columns of the matrix are linearly independent. If the null space has more than the zero vector, the columns of the matrix are linearly dependent.

      * trivial null space is just the zero vector.
      (7 votes)
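newbarker's 3x3 scenario can be sketched concretely (assuming NumPy; the matrix below, whose third column is the sum of the first two, is a hypothetical example):

```python
import numpy as np

# 3x3 matrix whose third column equals col1 + col2, so the
# columns are linearly dependent.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 2.0]])

x = np.array([1.0, 1.0, -1.0])   # nonzero, yet A @ x = 0

print(A @ x)                     # zero vector -> null space isn't trivial
print(np.linalg.matrix_rank(A))  # rank 2, not 3: columns are dependent
```

The nonzero null-space vector x records exactly how one column is built from the others: col1 + col2 - col3 = 0.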
  • vanshreebhalotia:
    why does rref(A) have to be a SQUARE matrix?
    (3 votes)
  • InnocentRealist:
    Is an mxn matrix a tensor? A vector of vectors (m R^n row vectors or n R^m column vectors)?
    (2 votes)

Video transcript

- [Voiceover] So I have the matrix A over here, and A has m rows and n columns, so we could call this an m by n matrix. And what I want to do in this video is relate the linear independence, or linear dependence, of the column vectors of A to the null space of A. So, first of all, what am I talking about when I say column vectors? Well, as you can see there's n columns here, and we could view each of those as an m-dimensional vector. And so, let me do it this way, so you can view this one right over here, we could write that as V one, V one, this next one over here, this would be V two, V two, and you would have n of these, because we have n columns, and so this one right over here would be V n, V sub n. And so we could rewrite A, we could rewrite the matrix A, the m by n matrix A, I'm bolding it to show that that's a matrix, we could rewrite it as, so let me do it the same way, so, draw my little brackets there, we can write it, just express it, in terms of its column vectors, 'cause we could just say well this is going to be V one for that column, V one for that column, V two for this column, all the way, we're gonna have n columns, so you're gonna have V n for the nth column. And remember, each of these are going to have m terms, or I should say, m components in them. These are m-dimensional column vectors. Now what I want to do, I said I want to relate the linear independence of these vectors to the null space of A. So let's remind ourselves what the null space of A even is. So the null space of A, the null space of A, is equal to the set, it's the set of all vectors x that are members of R n, and I'm gonna double down on why I'm saying R n in a second, such that, such that, if I take my matrix A, if I take my matrix A, and multiply it by one of those x's, by one of those x's, I'm going to get, I'm going to get the zero vector. So, why does x have to be a member of R n?
Well, just for the matrix multiplication to work, for this to be, if this is m by n, let me write this down, if this is m by n, well in order just to make the matrix multiplication work, or you could say the matrix vector multiplication, this has to be an n by one, an n by one, vector, and so it's gonna have n components, so it's gonna be a member of R n. If this was m by, well, let me use a different letter, if this was m by, I don't know, seven, then this would be R seven that we would be dealing with. So that is the null space. So, another way of thinking about it is, well if I take my matrix A and I multiply it by some vector x that's a member of this null space, I'm going to get the zero vector. So if I take my matrix A, which I've expressed here in terms of its column vectors, multiply it by some vector x, so some vector x right over here, we draw the other bracket, so this is the vector x, and so, it's going to have, it's a member of R n, so it's going to have n components, you're gonna have x one as the first component, x two, and go all the way to x n. If we say that this x is a member of the null space of A, then this whole thing is going to be equal to the zero vector, is going to be equal to the zero vector, and once again the zero vector, this is gonna be an m by one vector, so it's gonna have the same number of rows as A, so I'll try to make the brackets roughly the same length, try and draw my brackets neatly, so you're gonna have m of these, one, two, and then go all the way to the mth zero. So, let's actually just multiply this out, using what we know of matrix multiplication.
And by the definition of matrix multiplication, one way to view this, if you were to multiply our matrix A times our vector x here, you're going to get the first column vector, V one, V one, times the first component here, x one, x one, plus, the second component times the second column vector, x two times V two, V two, and we're gonna do that n times, so plus dot dot dot x sub n times V sub n, V sub n, and these all when you add them together are going to be equal to the zero vector. Now this should be, this, so it's gonna be equal to the zero vector, and now this should start ringing a bell to you, when we looked at, when we looked at linear independence we saw something like this, in fact we saw that these vectors V, V sub one, V sub two, these n vectors, are linearly independent if and only if, any linear, if and only if the solution to this, or I guess you could say the weights on these vectors, the only way to get this to be true is if x one, x two, x n are all equal zero. So let me write this down. So V sub one, V sub two, all the way to V sub n, are linearly independent, linearly independent, if and only if, if and only if, only solution, so let me, only solution, or you could say weights on these vectors, to this equation, only solution is x one, x two, all the way to x n are equal to zero. So if the only solution here, if the only way to get this sum to be equal to the zero vector, is if x one, x two and x, all the way through x n, are equal to zero, well that means that our vectors V one, V two, all the way to V n, are linearly independent, or vice versa, if they're linearly independent, then the only solution to this, if we're solving for the weights on those vectors, is if for x one, x two and x n to be equal to zero. 
Remember, linear independence, if you want to say, it's still mathematical, but a little bit more, common language is, if these vectors are linearly independent, that means that none of these vectors can be constructed by linear combinations of the other vectors, or, looking at it this way, this right over here is a, you could view this as a linear combination of all of the vectors, that the only way to get this linear combination of all the vectors to be equal to zero, is if x one, x two, all the way through x n are equal to zero, and we proved that in other videos on linear independence. Well, if the only solution to this is all of the x one's through x n's are equal to zero, that means that the null space, this is only going to be true, you could say, if and only if, the null space of A, the null space of A, let me make sure it looks like a matrix, I'm gonna bold it, the null space of A only contains one vector, it only contains the zero vector. Remember, this is, if all of these are gonna be zero, well then the only solution here is gonna be the zero vector, is going to be, is going to be the zero vector. So the result that we're showing here is, if the column vectors of a matrix are linearly independent, then the null space of that matrix is only going to consist of the zero vector. Or you could go the other way. If the null space of a matrix only contains the zero vector, well that means that the columns of that matrix are linearly independent.
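The equivalence the video ends on, columns linearly independent if and only if the null space contains only the zero vector, can be tested numerically via the rank: the only solution of Ax = 0 is x = 0 exactly when the rank equals the number of columns. A minimal sketch (assuming NumPy; `columns_independent` and the sample matrices are hypothetical names and examples):

```python
import numpy as np

def columns_independent(A):
    # Columns are linearly independent exactly when the rank equals the
    # number of columns, i.e., when N(A) = {0}.
    return np.linalg.matrix_rank(A) == A.shape[1]

ind = np.array([[1.0, 0.0],
                [0.0, 1.0],
                [1.0, 1.0]])   # independent columns -> trivial null space

dep = np.array([[1.0, 2.0],
                [2.0, 4.0],
                [3.0, 6.0]])   # col2 = 2*col1 -> nontrivial null space

print(columns_independent(ind))  # True
print(columns_independent(dep))  # False
```

Note the `ind` example is 3x2, echoing Mike Bergen's comment above: independence does not require a square matrix, only that the rank matches the column count.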