Transposes of sums and inverses

Created by Sal Khan.

Video transcript

Let's see if we can prove to ourselves some more reasonably interesting transpose properties. So let's define some matrix C that's equal to the sum of two other matrices, A and B. Any entry in C, I can denote with a lowercase cij. So if I want the entry in the ith row and jth column, it would be cij, and each of its entries is going to be the sum of the corresponding entries of matrices A and B. So our ij entry in C is going to be equal to the ij entry in A, plus the ij entry in B. That's our definition of matrix addition: you take the corresponding entries in the same row and column, add them up, and that gives you the entry in the same row and column of your new matrix, which is the sum of the other two.

Now, let's think a little bit about the transposes of these guys. So say A looks like this. I won't draw all of the entries-- it takes forever-- but each of its entries is aij, just like that. And let's say that A transpose looks like this. The entry in that same position in A transpose we're going to call a-prime ij. These two things probably aren't going to be the same-- there's some chance they are, but they're probably not-- but that's its ijth entry, in the ith row, jth column, of A transpose. Now, the fact that this is the transpose of that means that everything that's in some row and column here is going to be in that column and row over here; the rows and columns get switched. So we know that a-prime ij is going to be the same entry that was in aji. Maybe aji is over here. So this thing over here, which is in the same position as this one, is going to be equal to this guy over here if you switch the rows and columns. I think you can accept that.

And you can make the same argument for B. Let me actually draw it out. So if I make B transpose, the entry in its ith row and jth column I'll call b-prime ij, just like I did for A. So we could say that b-prime ij is equal to-- you take the matrix B-- whatever entry is in the jth row and ith column. This is, you could almost say, the definition of the transpose. If I'm in the third row and second column now, it's going to be what was in the second row and third column. Fair enough.

So we already have what cij is equal to. What's the corresponding entry of C transpose going to be equal to? Let me write that down over here. I'll use the same notation: the prime means that we're taking entries in the transpose. So C transpose is just going to be a bunch of entries c-prime ij, with a little prime there showing that those are entries in the transpose, and not in C itself. And we know that c-prime ij is equal to cji. Nothing new at all-- we've just expressed the definition of the transpose for these three matrices.

Now what is cji equal to? Let's focus on this a little bit. We know that cij is equal to a sub ij plus b sub ij, so if you swap them around, this is going to be equal to a sub ji plus b sub ji-- you just swap the j's and the i's. I just used this information here-- you could almost view it as this assumption or this definition-- to go from this to this. If I had an x and a y here, I'd have an x and a y here, and an x and a y here. I have a j and an i here, so I have a j and an i there, and a j and an i right there. Now what are these equal to?
This guy right here is equal to-- we'll do it in green-- the ij entry of A transpose, a-prime ij. And this guy is equal to the ij entry of B transpose, b-prime ij. Now, what is this telling us? It's telling us something about the transpose of C, and C is the same thing as A plus B, so it's saying that A plus B, transposed, is the same thing as C transpose. Let me write that: C transpose is the same thing as A plus B transpose. So these are the entries in A plus B transpose right here. And what are these over here? We'll put the equal sign over here. These are the entries in A transpose plus B transpose. Right? These are the entries in A transpose, these are the entries in B transpose, and if you take the sum of the two, you're just adding up the corresponding entries. So it's straightforward to show that if you take the sum of two matrices and then transpose it, it's equivalent to transposing them first, and then taking their sum. Which is a reasonably neat outcome.

Let's do one more, and I think we'll finish up all of our major transpose properties. This one is going to be a slightly different take on things, but we're still going to take the transpose. If we know that A inverse is the inverse of A, that means that A times A inverse is equal to the identity matrix-- assuming that these are n-by-n matrices, so it's the n-dimensional identity matrix-- and that A inverse times A is also going to be equal to the identity matrix.

Now, let's take the transpose of both sides of these equations. I'll do them both simultaneously. If you take the transpose of both sides of the first equation, you get A times A inverse, transposed, is equal to the identity matrix transpose. And what's the transpose of the identity matrix? Let's draw it out. The identity matrix has ones all the way down the diagonal and everything else is 0. You could view the diagonal entries as i sub 1,1, i sub 2,2, all the way down to i sub n,n, with everything else 0. So when you take the transpose, you're just swapping zeroes with zeroes, and the diagonal does not change. So the transpose of the identity matrix is equal to the identity matrix. And we can apply that same thing here, taking the transpose of the second statement: A inverse times A, transposed, is equal to the identity matrix transpose, which is equal to the identity matrix.

And then we know what happens when you take the transpose of a product: it's equal to the product of the transposes in reverse order. So this first thing right here we can rewrite as A inverse transpose times A transpose, which is going to be equal to the identity matrix. You can do the same thing over here: this second thing is going to be equal to A transpose times A inverse transpose, which is also going to be equal to the identity matrix.

Now, this is an interesting statement. The fact that this guy right here, times the transpose of A, is equal to the identity matrix, and the transpose of A times that same guy is equal to the identity matrix, implies that A inverse transpose is the inverse of A transpose. Or, another way of writing that: if I take A transpose and then take its inverse, that is going to be equal to this guy-- it's going to be equal to A inverse transpose. So, another neat outcome dealing with transposes. If you take the inverse of the transpose, it's the same thing as the transpose of the inverse.
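If you want to sanity check the sum property numerically, here's a minimal sketch using NumPy. The particular matrices are just made-up examples for illustration, not anything from the video:

```python
import numpy as np

# Two small made-up matrices of the same shape.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
B = np.array([[7.0, 8.0, 9.0],
              [0.0, 1.0, 2.0]])

C = A + B  # entry-wise: c_ij = a_ij + b_ij

# Transposing the sum gives the same matrix as summing the transposes.
print(np.array_equal(C.T, A.T + B.T))  # True
```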
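And here's a similar sketch for the inverse property, again using a hypothetical invertible matrix chosen only for illustration. It also checks along the way that the identity matrix is its own transpose:

```python
import numpy as np

n = 3
I = np.eye(n)

# The identity matrix is its own transpose: the diagonal doesn't move.
print(np.array_equal(I.T, I))  # True

# A made-up invertible 3x3 matrix.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [1.0, 0.0, 1.0]])

A_inv = np.linalg.inv(A)

# (A^-1)^T times A^T gives the identity, and so does A^T times (A^-1)^T,
# so (A^-1)^T really is the inverse of A^T.
print(np.allclose(A_inv.T @ A.T, I))            # True
print(np.allclose(A.T @ A_inv.T, I))            # True
print(np.allclose(np.linalg.inv(A.T), A_inv.T)) # True
```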