ELIAS EBNER


Matrix Symmetries

Certain terminology is used to describe matrices depending on how the transpose operation modifies them.

Symmetry

A square matrix $A$ is said to be symmetric when the transpose operation has no effect on it. In other words, a matrix is symmetric if and only if

$$A^T = A.$$

Equivalently, a matrix $A_{n \times n}$ is symmetric if and only if

$$[A]_{ij} = [A]_{ji}, \quad \text{for each } 1 \le i, j \le n.$$

Example

Take the matrix

$$A = \begin{bmatrix} 1 & 9 & 3 \\ 9 & 8 & 6 \\ 3 & 6 & 4 \end{bmatrix}.$$

This matrix is symmetric, since

$$\begin{bmatrix} 1 & 9 & 3 \\ 9 & 8 & 6 \\ 3 & 6 & 4 \end{bmatrix}^T = \begin{bmatrix} 1 & 9 & 3 \\ 9 & 8 & 6 \\ 3 & 6 & 4 \end{bmatrix}.$$

Now take the matrix

$$B = \begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \end{bmatrix}.$$

This matrix is not symmetric, since

$$\begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \end{bmatrix}^T = \begin{bmatrix} 1 & 4 & 7 \\ 2 & 5 & 8 \\ 3 & 6 & 9 \end{bmatrix} \ne \begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \end{bmatrix}.$$
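The definition translates directly into code. As a minimal sketch, assuming NumPy is available (the helper `is_symmetric` is my own name for illustration, not a library function), the two matrices above can be checked like this:

```python
import numpy as np

# The symmetric example matrix A and the non-symmetric matrix B from above.
A = np.array([[1, 9, 3],
              [9, 8, 6],
              [3, 6, 4]])
B = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])

def is_symmetric(M):
    """A matrix is symmetric iff it is square and equals its transpose."""
    return M.shape[0] == M.shape[1] and np.array_equal(M.T, M)

print(is_symmetric(A))  # True
print(is_symmetric(B))  # False
```

The square check matters: `np.array_equal` would simply return `False` for a rectangular matrix, but making the condition explicit mirrors the definition.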

Skew-Symmetry

A square matrix $A$ is said to be skew-symmetric when its transpose equals its additive inverse. That is, a matrix is skew-symmetric if and only if

$$A^T = -A.$$

As above, the definition can also be expressed element-wise: a matrix $A_{n \times n}$ is skew-symmetric if and only if

$$[A]_{ij} = -[A]_{ji}, \quad \text{for each } 1 \le i, j \le n.$$

Example

Take the matrix

$$A = \begin{bmatrix} 0 & 7 & -6 \\ -7 & 0 & -4 \\ 6 & 4 & 0 \end{bmatrix}.$$

This matrix is skew-symmetric, since

$$\begin{align*} \begin{bmatrix} 0 & 7 & -6 \\ -7 & 0 & -4 \\ 6 & 4 & 0 \end{bmatrix}^T &= \begin{bmatrix} 0 & -7 & 6 \\ 7 & 0 & 4 \\ -6 & -4 & 0 \end{bmatrix} \\ &= -\begin{bmatrix} 0 & 7 & -6 \\ -7 & 0 & -4 \\ 6 & 4 & 0 \end{bmatrix} = -A. \end{align*}$$

Notice how, for a matrix to be skew-symmetric, the elements along its diagonal must all be $0$. That is because the diagonal entries satisfy

$$a_{ii} = -a_{ii},$$

and therefore

$$\begin{align*} 2 a_{ii} &= 0 \\ a_{ii} &= 0. \end{align*}$$
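Both the defining condition and the zero-diagonal consequence are easy to confirm numerically. A short sketch, again assuming NumPy (`is_skew_symmetric` is a hypothetical helper name of my choosing):

```python
import numpy as np

# The skew-symmetric example matrix from above.
A = np.array([[ 0,  7, -6],
              [-7,  0, -4],
              [ 6,  4,  0]])

def is_skew_symmetric(M):
    """A matrix is skew-symmetric iff it is square and M^T = -M."""
    return M.shape[0] == M.shape[1] and np.array_equal(M.T, -M)

print(is_skew_symmetric(A))      # True
# The diagonal entries are forced to be zero, as derived above.
print(np.all(np.diag(A) == 0))   # True
```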

Diagonal Matrices

In this context I would like to introduce the concept of diagonal matrices. A diagonal matrix, which is always square, of size $n \times n$ is a matrix of the form

$$D = \begin{bmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_n \end{bmatrix}.$$

Notice how each time I referred to symmetry in the context of matrices, I meant symmetry along the main diagonal, the axis running from the top-left to the bottom-right. That is because the transpose operation "mirrors" the matrix along this axis.

Therefore, a diagonal matrix like the one shown above is always symmetric: its only potentially non-zero elements lie along this axis, and all other elements are $0$. Keep in mind that, purely by coincidence, it could be the case that $\lambda_i = 0$ for some or all $i$.

Therefore, the square zero matrix $0_{n \times n}$ is also a diagonal matrix. The zero matrix is special because it is not only symmetric but also skew-symmetric. In fact, for each $n$, the zero matrix of size $n \times n$ is the only matrix that is both symmetric and skew-symmetric at the same time.
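These two observations about diagonal and zero matrices can also be verified directly. A minimal sketch, assuming NumPy, using `np.diag` to build a diagonal matrix from its entries $\lambda_1, \dots, \lambda_n$:

```python
import numpy as np

# A diagonal matrix built from its diagonal entries (the lambdas).
D = np.diag([2.0, -1.0, 5.0])
# The square zero matrix of size 3x3.
Z = np.zeros((3, 3))

# Every diagonal matrix is symmetric.
print(np.array_equal(D.T, D))                               # True
# The zero matrix is both symmetric and skew-symmetric.
print(np.array_equal(Z.T, Z) and np.array_equal(Z.T, -Z))   # True
```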

Final Remark

Notice how only square matrices can be symmetric or skew-symmetric. When you transpose a rectangular matrix of shape $m \times n$, where $m \ne n$, you end up with a matrix of shape $n \times m$, so the matrix and its transpose cannot be equal.

On the other hand, when you transpose a square matrix of shape $n \times n$, you again get a matrix of shape $n \times n$, so there is at least a chance for the two to be equal, provided the elements also match up.
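The shape argument is simple to see in code as well. A short sketch, assuming NumPy, with two arbitrary example matrices of my own choosing:

```python
import numpy as np

R = np.ones((2, 3))   # a rectangular 2x3 matrix
S = np.ones((3, 3))   # a square 3x3 matrix

# Transposing swaps the shape, so R can never equal R^T...
print(R.shape, R.T.shape)   # (2, 3) (3, 2)
# ...while for a square matrix equality is at least possible.
print(S.shape, S.T.shape)   # (3, 3) (3, 3)
```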
