ELIAS EBNER



Reverse Order Law for the Transposition of Matrices

We covered matrix transposition in a previous lesson. Now that we know about matrix multiplication, we can learn an important rule for transposing the product of two matrices.

In general, the following holds for two conformable matrices $A$ and $B$:

$$(AB)^T = B^T A^T.$$

As you can see, we had to swap the order of $A$ and $B$ in the product.
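A quick numerical check (a sketch using NumPy, with arbitrarily chosen example matrices) illustrates the identity:

```python
import numpy as np

# Two conformable matrices: A is 2x3, B is 3x2 (arbitrary example values).
A = np.array([[1, 2, 3],
              [4, 5, 6]])
B = np.array([[7, 8],
              [9, 10],
              [11, 12]])

# Transpose of the product...
lhs = (A @ B).T
# ...equals the product of the transposes in REVERSE order.
rhs = B.T @ A.T

print(np.array_equal(lhs, rhs))  # True
```

Note that the shapes already force the reversal: here $(AB)^T$ is $2 \times 2$, while $A^T B^T$ would be $3 \times 3$, so keeping the original order could not possibly give the transpose of the product.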

Proof

If you’re interested, here is the proof.

Let $n$ be the number of columns in matrix $A$ and the number of rows in matrix $B$.

By the definitions of matrix transposition and matrix multiplication, we get that for each row $i$ and column $j$,

$$\left[ (AB)^T \right]_{ij} = [AB]_{ji} = A_{j \star} B_{\star i} = \sum_{k = 1}^n A_{jk} B_{ki}.$$

Since $A_{jk}$ and $B_{ki}$ are scalars, their multiplication is commutative, so

$$\sum_{k = 1}^n A_{jk} B_{ki} = \sum_{k = 1}^n B_{ki} A_{jk}.$$

By the definition of matrix transposition, we have that

$$\sum_{k = 1}^n B_{ki} A_{jk} = \sum_{k = 1}^n \left[ B^T \right]_{ik} \left[ A^T \right]_{kj}.$$

Finally, by the definition of matrix multiplication, we have that

$$\sum_{k = 1}^n \left[ B^T \right]_{ik} \left[ A^T \right]_{kj} = \left[ B^T \right]_{i \star} \left[ A^T \right]_{\star j} = \left[ B^T A^T \right]_{ij}.$$

We have just shown that $\left[ (AB)^T \right]_{ij} = \left[ B^T A^T \right]_{ij}$ for all rows $i$ and columns $j$. Since every corresponding entry agrees, the two matrices are equal. That is,

$$(AB)^T = B^T A^T.$$

Thus, the transpose of a product always reverses the order of the factors.
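The entrywise identity at the heart of the proof can be checked numerically. This sketch (using NumPy with randomly generated example matrices) compares $\left[(AB)^T\right]_{ij}$, the explicit sum $\sum_k A_{jk} B_{ki}$, and $\left[B^T A^T\right]_{ij}$ for every entry:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.integers(-5, 5, size=(3, 4))  # arbitrary 3x4 integer matrix
B = rng.integers(-5, 5, size=(4, 2))  # arbitrary 4x2 integer matrix
n = A.shape[1]  # shared inner dimension

# Check [(AB)^T]_{ij} = sum_k A_{jk} B_{ki} = [B^T A^T]_{ij} entry by entry.
ok = True
for i in range(B.shape[1]):
    for j in range(A.shape[0]):
        s = sum(A[j, k] * B[k, i] for k in range(n))
        ok &= (A @ B).T[i, j] == s == (B.T @ A.T)[i, j]

print(ok)  # True
```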

Multiplying by the Transpose

From what we proved above, another interesting fact about matrices can be deduced.

For any matrix $A_{m \times n}$, the products $A^T A$ and $A A^T$ are always symmetric matrices.

More formally, for any matrix $A$,

$$(A^T A)^T = A^T A,$$

and

$$(A A^T)^T = A A^T.$$

We can prove this using the reverse order law of transposition proved above. Applying it to the product $A^T A$ gives

$$(A^T A)^T = A^T (A^T)^T.$$

Then, since transposing a matrix twice returns the original matrix,

$$A^T (A^T)^T = A^T A.$$

The same argument applies to the second product:

$$(A A^T)^T = (A^T)^T A^T = A A^T.$$
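Both symmetry claims are easy to verify numerically. A minimal sketch, again using NumPy with an arbitrarily chosen rectangular matrix:

```python
import numpy as np

# Any (even non-square) matrix A yields symmetric A^T A and A A^T.
A = np.array([[1., 2., 3.],
              [4., 5., 6.]])  # 2x3, arbitrary example values

gram = A.T @ A    # 3x3 product A^T A
outer = A @ A.T   # 2x2 product A A^T

# A matrix is symmetric exactly when it equals its own transpose.
print(np.array_equal(gram, gram.T))    # True
print(np.array_equal(outer, outer.T))  # True
```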