
Lesson 7: Transpose of a matrix

# Transposes of sums and inverses

Transposes of sums and inverses. Created by Sal Khan.

## Want to join the conversation?

• Is the transpose of an invertible matrix always invertible? I couldn't make that out from the lesson.
• Yes, because the determinant of a transpose is the same as the determinant of the original matrix.
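A quick numeric check of this answer — a sketch in NumPy, where the matrix A below is just a hypothetical example (any invertible matrix works):

```python
import numpy as np

# Hypothetical invertible 2x2 matrix (det = 1).
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])

# det(A^T) equals det(A), so A^T is invertible exactly when A is.
det_A = np.linalg.det(A)
det_At = np.linalg.det(A.T)

# Since det(A^T) != 0, the inverse of A^T exists.
inv_At = np.linalg.inv(A.T)
```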
• Hi,
I have a question that I'm not sure how to answer.
What is the determinant of a 3x3 matrix A if A^t = -A?
My first instinct is to say that normally det(A^t) = det(A), but in this case
det(A^t) = det(-A) = -det(A).
Is this correct?
Thanks!
• On one hand, det(A^t)=det(A) (this is always true).
But since A^t=-A and A is 3x3, det(A^t)=det(-A)=(-1)^3det(A)=-det(A). So then
det(A)=-det(A), so det(A)=0.
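The argument above can be checked numerically. A sketch in NumPy, using one hypothetical 3x3 skew-symmetric matrix (A^t = -A):

```python
import numpy as np

# A hypothetical 3x3 matrix with A^T = -A (skew-symmetric).
A = np.array([[ 0.0,  2.0, -1.0],
              [-2.0,  0.0,  3.0],
              [ 1.0, -3.0,  0.0]])

# Since det(A^T) = det(A) and det(-A) = (-1)^3 det(A) = -det(A),
# det(A) must equal -det(A), i.e. det(A) = 0.
det_A = np.linalg.det(A)
```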
• I'm not clear on this. What is meant by A inverse?
• Suppose we have a square matrix, A, whose size is n. The inverse of A, denoted A^(-1), is another square matrix of size n such that A*A^(-1) = A^(-1)*A = I where I is the identity matrix with size n. Note that not all square matrices have an inverse.
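As a small illustration of that definition — a NumPy sketch, where the matrix is a made-up 2x2 example:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 5.0]])   # hypothetical 2x2 matrix, det = -1, so invertible

A_inv = np.linalg.inv(A)

# By the definition of the inverse, both products give the identity matrix.
left = A_inv @ A
right = A @ A_inv
I = np.eye(2)
```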
• What is an application of the transpose, visually?
• Hello, I am new to learning how to transpose equations and was wondering if someone could suggest a good video as an intro on how to transpose. I'm looking for something very basic because I seem to be struggling with the concept. An example of something I would like to transpose would be an equation such as XL = 2πFL. Thank you.
• You don't typically take the "transpose of an equation". It's just switching corresponding rows and columns in matrices.
• Vectors are sometimes represented as matrices, for example
[a]
[b]. If we take the transpose of that matrix, it
would be [a b]. How would you represent it in the Cartesian system? Or is it undefined?
• How can I see an example of transposing using algebra?
• So, at one point in the video, Sal says that C^T = (A + B)^T = A^T + B^T. Coincidentally, this is also the first requirement for linearity, or proving that a transformation is linear. I think it's pretty obvious that (cA)^T = c(A^T), so we can represent transposing a matrix as a linear transformation, and therefore a matrix product. I guess my question is, how do we construct the matrix that will transform any matrix A to its transpose A^T?
• Identify specific methods to move from the right side of the matrix to the left.

## Video transcript

Let's see if we can prove to ourselves some more reasonably interesting transpose properties. So let's define some matrix C that's equal to the sum of two other matrices, A and B. And so any entry in C, I can denote with a lowercase cij. So if I want the entry in the ith row and jth column, it would be cij, and each of its entries is going to be the sum of the corresponding entries in matrices A and B. So our ij entry in C is going to be equal to the ij entry in A plus the ij entry in B. That's our definition of matrix addition. You just take the corresponding entries in the same row and column, add them up, and you get the entry in the same row and column of your new matrix, which is the sum of the other two. Now, let's think a little bit about the transposes of these guys right here. So, if A looks like this. I won't draw all of the entries; it takes forever. But each of its entries is aij, just like that. Let's say that A transpose looks like this. For the entry in that same position, we're going to call it a-prime ij. And these things probably aren't going to be the same. There's some chance they are, but they're probably not going to be the same. But that's its ijth entry, in the ith row, jth column, in A transpose. Now, the fact that this is the transpose of that means that everything that's in some row and column here is going to be in that column and row over here; the rows and columns get switched. So we know that we could write that a-prime ij is going to have the same entry that was in aji. Maybe aji is over here. So, this thing over here, which is in the same position as this one, is going to be equal to this guy over here if you switch the rows and columns. I think you can accept that. And you can make the same argument for B. Let me actually draw it out. So if I make B transpose, the entry in the ith row and jth column, I'll call it b-prime ij. Just like that. Just like I did for A.
So we could say that b-prime ij is equal to, if you take the matrix B, the entry that's in the jth row and ith column. This is, you could almost say, the definition of the transpose. If I'm in the third row and second column now, it's going to be what was in the second row and third column. Fair enough. So we already have what cij is equal to. What's the corresponding entry of the transpose of C going to be equal to? Let me write that down. So C transpose, let me write it over here. I'll use the same notation: the prime means that we're taking entries in the transpose. So C transpose is just going to be a bunch of entries, c-prime ij, and the little prime shows that those are entries in the transpose, not in C itself. And we know that c-prime ij is equal to cji. Nothing new at all. We've just expressed the definition of the transpose for these three matrices. Now what is cji equal to? So let's focus on this a little bit. What is cji equal to? We know that cij is equal to a sub ij plus b sub ij, so if you swap them around, this is going to be equal to, you just swap the j's and the i's, a sub ji plus b sub ji. I just used this information here-- you could almost view it as this assumption or this definition-- to go from this to this. If I had an x and a y here, I'd have an x and a y here, and an x and a y here. I have a j and an i here, so I have a j and an i there, and a j and an i right there. Now what are these equal to? This guy right here is equal to-- we'll do it in the green-- the same entry of the transpose of A at ij. And this is equal to the same entry of the transpose of B at ij. Now, what is this telling us? It's telling us that the transpose of C-- and C is the same thing as A plus B-- so it's saying that A plus B transpose is the same thing as C transpose. Let me write that: C transpose is the same thing as A plus B transpose.
So these are the entries in A plus B transpose right here. And what are these over here? These are the entries in A transpose plus B transpose. Right? These are the entries in A transpose, these are the entries in B transpose, and if you take the sum of the two, you're just adding up the corresponding entries. So that's straightforward to show that if you take the sum of two matrices and then transpose it, it's equivalent to transposing them first, and then taking their sum. Which is a reasonably neat outcome. Let's do one more, and I think we'll finish up all of our major transpose properties. Let's say that we have A inverse-- this is going to be a slightly different take on things; we're still going to take the transpose. So if we know that A inverse is the inverse of A, that means that A times A inverse is equal to the identity matrix, assuming that these are n-by-n matrices. So it's the n-dimensional identity matrix. And A inverse times A is also going to be equal to the identity matrix. Now, let's take the transpose of both sides of this equation. I'll do them both simultaneously. So if you take the transpose of both sides of the equation, you get (A times A inverse) transpose is equal to the identity matrix transpose. And what's the transpose of the identity matrix? Let's draw it out. The identity matrix looks like this: you have just ones all the way down the diagonal, and everything else is 0. Right, and you could view this as i_(1,1), i_(2,2), all the way down to i_(n,n); everything else is 0. So when you take the transpose, you're just swapping the zeroes, right? These guys don't change. The diagonal does not change when you take the transpose. So the transpose of the identity matrix is equal to the identity matrix. And so we can apply that same thing here. Let's take the transpose of this statement.
So we know that (A inverse times A) transpose is equal to the identity matrix transpose, which is equal to the identity matrix. And then we know what happens when you take the transpose of a product: it's equal to the product of the transposes in reverse order. So this thing right here we can rewrite as A inverse transpose times A transpose, which is going to be equal to the identity matrix. You could do the same thing over here. This thing is going to be equal to A transpose times A inverse transpose, which is also going to be equal to the identity matrix. Now, this is an interesting statement. The fact that, if I have this guy right here, times the transpose of A, equal to the identity matrix, and the transpose of A times that same guy equal to the identity matrix, implies that A inverse transpose is the inverse of A transpose. Or another way of writing that: if I take A transpose and take its inverse, that is going to be equal to this guy. It's going to be equal to A inverse transpose. So, another neat outcome dealing with transposes. If you take the inverse of the transpose, it's the same thing as the transpose of the inverse.
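The two properties worked out in the transcript can be verified numerically. Here is a sketch in NumPy with made-up 3x3 matrices (any matrices of matching size work for the first check; A must be invertible for the second):

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])   # hypothetical invertible matrix (det = 8)
B = np.array([[0.0, 1.0, 4.0],
              [2.0, 0.0, 1.0],
              [1.0, 1.0, 0.0]])   # hypothetical matrix of the same size

# (A + B)^T = A^T + B^T: transposing the sum equals summing the transposes.
sum_then_T = (A + B).T
T_then_sum = A.T + B.T

# (A^-1)^T = (A^T)^-1: the transpose of the inverse equals
# the inverse of the transpose.
inv_then_T = np.linalg.inv(A).T
T_then_inv = np.linalg.inv(A.T)
```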