# Normal matrix

In mathematics, a complex square matrix A is normal if it commutes with its conjugate transpose A*:

${\displaystyle A{\text{ normal}}\quad \iff \quad A^{*}A=AA^{*}}$

The concept of normal matrices can be extended to normal operators on infinite dimensional normed spaces and to normal elements in C*-algebras. As in the matrix case, normality means commutativity is preserved, to the extent possible, in the noncommutative setting. This makes normal operators, and normal elements of C*-algebras, more amenable to analysis.

The spectral theorem states that a matrix is normal if and only if it is unitarily similar to a diagonal matrix, and therefore any matrix A satisfying the equation A*A = AA* is diagonalizable. The converse does not hold because diagonalizable matrices may have non-orthogonal eigenspaces.

The left and right singular vectors in the singular value decomposition of a normal matrix ${\displaystyle \mathbf {A} =\mathbf {U} {\boldsymbol {\Sigma }}\mathbf {V} ^{*}}$ differ only in complex phase from each other and from the corresponding eigenvectors, since the phase must be factored out of the eigenvalues to form singular values.
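This relationship is easy to spot-check numerically. The NumPy sketch below (the random construction is an illustrative assumption, not taken from the text) builds a normal matrix from a random unitary and confirms that its singular values are the moduli of its eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a normal matrix A = U D U* from a random unitary U (via QR) and a
# diagonal D with generic complex eigenvalues -- an arbitrary illustrative choice.
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4)))
D = np.diag(rng.standard_normal(4) + 1j * rng.standard_normal(4))
A = Q @ D @ Q.conj().T

assert np.allclose(A @ A.conj().T, A.conj().T @ A)  # A is normal

# The singular values equal the moduli of the eigenvalues, sorted descending,
# because the complex phase is factored out of each eigenvalue.
sv = np.linalg.svd(A, compute_uv=False)
ev = np.linalg.eigvals(A)
assert np.allclose(sv, np.sort(np.abs(ev))[::-1])
```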

## Special cases

Among complex matrices, all unitary, Hermitian, and skew-Hermitian matrices are normal, with all eigenvalues being unit modulus, real, and imaginary, respectively. Likewise, among real matrices, all orthogonal, symmetric, and skew-symmetric matrices are normal, with all eigenvalues being complex conjugate pairs on the unit circle, real, and imaginary, respectively. However, it is not the case that all normal matrices are either unitary or (skew-)Hermitian, as their eigenvalues can be any complex number, in general. For example,

${\displaystyle A={\begin{bmatrix}1&1&0\\0&1&1\\1&0&1\end{bmatrix}}}$

is neither unitary, Hermitian, nor skew-Hermitian, because its eigenvalues are ${\displaystyle 2,(1\pm i{\sqrt {3}})/2}$; yet it is normal because

${\displaystyle AA^{*}={\begin{bmatrix}2&1&1\\1&2&1\\1&1&2\end{bmatrix}}=A^{*}A.}$
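A quick NumPy check of the example above confirms both the product identity and the stated eigenvalues:

```python
import numpy as np

A = np.array([[1, 1, 0],
              [0, 1, 1],
              [1, 0, 1]])

# AA* = A*A is the matrix with 2 on the diagonal and 1 elsewhere.
expected = np.array([[2, 1, 1],
                     [1, 2, 1],
                     [1, 1, 2]])
assert np.array_equal(A @ A.T, expected)
assert np.array_equal(A.T @ A, expected)

# The eigenvalues are 2 and (1 ± i*sqrt(3))/2, so A is neither
# unitary, Hermitian, nor skew-Hermitian.
ev = np.sort_complex(np.linalg.eigvals(A))
expected_ev = np.sort_complex(
    np.array([2, (1 + 1j * np.sqrt(3)) / 2, (1 - 1j * np.sqrt(3)) / 2]))
assert np.allclose(ev, expected_ev)
```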

## Consequences

Proposition: A normal triangular matrix is diagonal.
Proof: Let A be any normal upper triangular matrix. Since
${\displaystyle (A^{*}A)_{ii}=(AA^{*})_{ii},}$
using subscript notation, one can write the equivalent expression using instead the ith unit vector (${\displaystyle {\hat {\mathbf {e} }}_{i}}$ ) to select the ith row and ith column:
${\displaystyle {\hat {\mathbf {e} }}_{i}^{\intercal }\left(A^{*}A\right){\hat {\mathbf {e} }}_{i}={\hat {\mathbf {e} }}_{i}^{\intercal }\left(AA^{*}\right){\hat {\mathbf {e} }}_{i}.}$
The expression
${\displaystyle \left(A{\hat {\mathbf {e} }}_{i}\right)^{*}\left(A{\hat {\mathbf {e} }}_{i}\right)=\left(A^{*}{\hat {\mathbf {e} }}_{i}\right)^{*}\left(A^{*}{\hat {\mathbf {e} }}_{i}\right)}$
is equivalent, and so is
${\displaystyle \left\|A{\hat {\mathbf {e} }}_{i}\right\|^{2}=\left\|A^{*}{\hat {\mathbf {e} }}_{i}\right\|^{2},}$
which shows that the ith row must have the same norm as the ith column.
Consider i = 1. By triangularity, the only possibly nonzero entry of column 1 is the (1, 1) entry, which row 1 shares. The norm equality therefore forces entries 2 through n of row 1 to be zero. Continuing this argument for row–column pairs 2 through n shows A is diagonal.

The concept of normality is important because normal matrices are precisely those to which the spectral theorem applies:

Proposition. A matrix A is normal if and only if there exists a diagonal matrix Λ and a unitary matrix U such that A = UΛU*.

The diagonal entries of Λ are the eigenvalues of A, and the columns of U are the corresponding eigenvectors; the eigenvalues appear in Λ in the same order as their eigenvectors appear as columns of U.

Another way of stating the spectral theorem is to say that normal matrices are precisely those matrices that can be represented by a diagonal matrix with respect to a properly chosen orthonormal basis of Cn. Phrased differently: a matrix is normal if and only if its eigenspaces span Cn and are pairwise orthogonal with respect to the standard inner product of Cn.
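The "if" direction of the spectral theorem can be illustrated directly. The NumPy sketch below (the unitary and the eigenvalues are arbitrary illustrative choices) assembles A = UΛU* and checks that A is normal, with the prescribed spectrum and orthonormal eigenvectors:

```python
import numpy as np

rng = np.random.default_rng(1)

# A random unitary U (from the QR factorization of a random complex matrix)
# and an arbitrary diagonal Λ of complex eigenvalues.
U, _ = np.linalg.qr(rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3)))
lam = np.array([2.0, -1.0 + 0.5j, 3j])
A = U @ np.diag(lam) @ U.conj().T

# A = UΛU* is normal ...
assert np.allclose(A @ A.conj().T, A.conj().T @ A)
# ... its eigenvalues are exactly the diagonal entries of Λ ...
assert np.allclose(np.sort_complex(np.linalg.eigvals(A)), np.sort_complex(lam))
# ... and each column of U is an eigenvector: A u_k = λ_k u_k.
for k in range(3):
    assert np.allclose(A @ U[:, k], lam[k] * U[:, k])
```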

The spectral theorem for normal matrices is a special case of the more general Schur decomposition, which holds for all square matrices. Let A be a square matrix. Then by Schur decomposition it is unitarily similar to an upper-triangular matrix, say, B. If A is normal, so is B. But then B must be diagonal, for, as noted above, a normal upper-triangular matrix is diagonal.

The spectral theorem permits the classification of normal matrices in terms of their spectra, for example:

Proposition. A normal matrix is unitary if and only if all of its eigenvalues (its spectrum) lie on the unit circle of the complex plane.
Proposition. A normal matrix is self-adjoint if and only if its spectrum is contained in ${\displaystyle \mathbb {R} }$ . In other words: A normal matrix is Hermitian if and only if all its eigenvalues are real.
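Both classifications can be illustrated numerically. In the NumPy sketch below (the shared unitary and the eigenvalue choices are illustrative assumptions), a normal matrix built with unit-modulus eigenvalues comes out unitary, and one built with real eigenvalues comes out Hermitian:

```python
import numpy as np

rng = np.random.default_rng(4)
U, _ = np.linalg.qr(rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3)))

# Normal matrix whose eigenvalues lie on the unit circle -> unitary.
A = U @ np.diag(np.exp(1j * np.array([0.3, 1.1, -2.0]))) @ U.conj().T
assert np.allclose(A @ A.conj().T, np.eye(3))

# Normal matrix whose eigenvalues are real -> Hermitian (self-adjoint).
B = U @ np.diag([1.0, -2.0, 0.5]) @ U.conj().T
assert np.allclose(B, B.conj().T)
```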

In general, the sum or product of two normal matrices need not be normal. However, the following holds:

Proposition. If A and B are normal with AB = BA, then both AB and A + B are also normal. Furthermore there exists a unitary matrix U such that UAU* and UBU* are diagonal matrices. In other words A and B are simultaneously diagonalizable.

In this special case, the columns of U* are eigenvectors of both A and B and form an orthonormal basis in Cn. This follows by combining the theorems that, over an algebraically closed field, commuting matrices are simultaneously triangularizable and a normal matrix is diagonalizable; the added result is that these can both be done simultaneously.
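A NumPy sketch of the proposition (the matrices are illustrative choices, built from a shared random unitary precisely so that they commute):

```python
import numpy as np

rng = np.random.default_rng(2)
U, _ = np.linalg.qr(rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3)))

def is_normal(M):
    return np.allclose(M @ M.conj().T, M.conj().T @ M)

# Two normal matrices sharing the eigenvector basis U, hence commuting.
A = U @ np.diag([1.0 + 1j, 2.0, -1j]) @ U.conj().T
B = U @ np.diag([3.0, 1j, 1.0 - 1j]) @ U.conj().T

assert np.allclose(A @ B, B @ A)                 # AB = BA
assert is_normal(A @ B) and is_normal(A + B)     # product and sum stay normal

# Conjugating by U* diagonalizes both matrices simultaneously.
for M in (A, B):
    T = U.conj().T @ M @ U
    assert np.allclose(T, np.diag(np.diag(T)))
```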

## Equivalent definitions

It is possible to give a fairly long list of equivalent definitions of a normal matrix. Let A be an n × n complex matrix. Then the following are equivalent:

1. A is normal.
2. A is diagonalizable by a unitary matrix.
3. There exists a set of eigenvectors of A which forms an orthonormal basis for Cn.
4. ${\displaystyle \left\|A\mathbf {x} \right\|=\left\|A^{*}\mathbf {x} \right\|}$  for every x.
5. The Frobenius norm of A can be computed from the eigenvalues of A: ${\textstyle \operatorname {tr} \left(A^{*}A\right)=\sum _{j}\left|\lambda _{j}\right|^{2}}$ .
6. The Hermitian part 1/2(A + A*) and skew-Hermitian part 1/2(A − A*) of A commute.
7. A* is a polynomial (of degree n − 1) in A.[a]
8. A* = AU for some unitary matrix U.[1]
9. U and P commute, where we have the polar decomposition A = UP with a unitary matrix U and some positive semidefinite matrix P.
10. A commutes with some normal matrix N with distinct eigenvalues.
11. σi = |λi| for all 1 ≤ i ≤ n, where A has singular values σ1 ≥ ⋯ ≥ σn and eigenvalues ordered so that |λ1| ≥ ⋯ ≥ |λn|.[2]
12. ${\displaystyle A=B+iC}$  for two self-adjoint matrices B and C.
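Several of these conditions are easy to spot-check numerically. The NumPy sketch below verifies conditions 4, 5, and 6 for the 3 × 3 circulant example given earlier (the random test vector is an illustrative choice):

```python
import numpy as np

A = np.array([[1, 1, 0],
              [0, 1, 1],
              [1, 0, 1]], dtype=complex)
As = A.conj().T
assert np.allclose(A @ As, As @ A)          # condition 1: A is normal

# Condition 4: ||Ax|| = ||A*x|| for every x (checked at a random x).
rng = np.random.default_rng(3)
x = rng.standard_normal(3) + 1j * rng.standard_normal(3)
assert np.isclose(np.linalg.norm(A @ x), np.linalg.norm(As @ x))

# Condition 5: tr(A*A) equals the sum of |λ_j|².
ev = np.linalg.eigvals(A)
assert np.isclose(np.trace(As @ A).real, np.sum(np.abs(ev) ** 2))

# Condition 6: the Hermitian and skew-Hermitian parts commute.
H = (A + As) / 2
S = (A - As) / 2
assert np.allclose(H @ S, S @ H)
```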

Some but not all of the above generalize to normal operators on infinite-dimensional Hilbert spaces. For example, a bounded operator satisfying (9) is only quasinormal.

## Normal matrix analogy

It is occasionally useful (but sometimes misleading) to think of the relationships of special kinds of normal matrices as analogous to the relationships of the corresponding type of complex numbers of which their eigenvalues are composed. This is because any function of a non-defective matrix acts directly on each of its eigenvalues, and the conjugate transpose of its spectral decomposition ${\displaystyle VDV^{*}}$  is ${\displaystyle VD^{*}V^{*}}$ , where ${\displaystyle D}$  is the diagonal matrix of eigenvalues. Likewise, if two normal matrices commute and are therefore simultaneously diagonalizable, any operation between these matrices also acts on each corresponding pair of eigenvalues.

As a special case, the complex numbers may be embedded in the normal 2×2 real matrices by the mapping

${\displaystyle a+bi\mapsto {\begin{bmatrix}a&b\\-b&a\end{bmatrix}}=a\,{\begin{bmatrix}1&0\\0&1\end{bmatrix}}+b\,{\begin{bmatrix}0&1\\-1&0\end{bmatrix}}\,.}$

This mapping preserves addition and multiplication, and it is easy to check that the embedding respects all of the above analogies.
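The homomorphism property can be verified directly. In the NumPy sketch below, the function name `embed` is an illustrative choice; it realizes the mapping above and the checks confirm that addition, multiplication, and conjugation (as transpose) are preserved:

```python
import numpy as np

def embed(z: complex) -> np.ndarray:
    """Embed a + bi as the normal 2x2 real matrix [[a, b], [-b, a]]."""
    return np.array([[z.real, z.imag],
                     [-z.imag, z.real]])

z, w = 1 + 2j, 3 - 1j

# The embedding is a ring homomorphism: addition and multiplication carry over.
assert np.allclose(embed(z) + embed(w), embed(z + w))
assert np.allclose(embed(z) @ embed(w), embed(z * w))

# Complex conjugation corresponds to the transpose, so every image is normal.
assert np.allclose(embed(z).T, embed(z.conjugate()))
M = embed(z)
assert np.allclose(M @ M.T, M.T @ M)
```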

## Notes

1. ^ Proof. When ${\displaystyle A}$  is normal, use Lagrange's interpolation formula to construct a polynomial ${\displaystyle P}$  such that ${\displaystyle {\overline {\lambda _{j}}}=P(\lambda _{j})}$ , where ${\displaystyle \lambda _{j}}$  are the eigenvalues of ${\displaystyle A}$ .

## Sources

• Horn, Roger Alan; Johnson, Charles Royal (1985). Matrix Analysis. Cambridge University Press. ISBN 978-0-521-38632-6.
• Horn, Roger Alan; Johnson, Charles Royal (1991). Topics in Matrix Analysis. Cambridge University Press. ISBN 978-0-521-30587-7.