# Symmetric matrix

In linear algebra, a **symmetric matrix** is a square matrix that is equal to its transpose. Formally, matrix *A* is symmetric if *A* = *A*^{T}.

Because equal matrices have equal dimensions, only square matrices can be symmetric.

The entries of a symmetric matrix are symmetric with respect to the main diagonal. So if the entries are written as *A* = (*a*_{ij}), then *a*_{ij} = *a*_{ji}, for all indices *i* and *j*.
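This entrywise condition is easy to test directly. A minimal pure-Python sketch (the function name `is_symmetric` is illustrative, not from any particular library):

```python
def is_symmetric(A):
    """Check a_ij == a_ji for every pair of indices of a square matrix."""
    n = len(A)
    return all(A[i][j] == A[j][i] for i in range(n) for j in range(n))

A = [[1, 7, 3],
     [7, 4, 5],
     [3, 5, 6]]
B = [[1, 2],
     [3, 4]]
print(is_symmetric(A))  # True
print(is_symmetric(B))  # False
```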

The following 3 × 3 matrix is symmetric:

$$A = \begin{bmatrix} 1 & 7 & 3 \\ 7 & 4 & 5 \\ 3 & 5 & 6 \end{bmatrix}$$

Every square diagonal matrix is symmetric, since all off-diagonal elements are zero. Similarly, in characteristic different from 2, each diagonal element of a skew-symmetric matrix must be zero, since each is its own negative.

In linear algebra, a real symmetric matrix represents a self-adjoint operator^{[1]} over a real inner product space. The corresponding object for a complex inner product space is a Hermitian matrix with complex-valued entries, which is equal to its conjugate transpose. Therefore, in linear algebra over the complex numbers, it is often assumed that a symmetric matrix refers to one which has real-valued entries. Symmetric matrices appear naturally in a variety of applications, and typical numerical linear algebra software makes special accommodations for them.

## Properties

The sum and difference of two symmetric matrices are again symmetric, but this is not always true for the product: given symmetric matrices *A* and *B*, the product *AB* is symmetric if and only if *A* and *B* commute, i.e., if *AB* = *BA*. For any integer *n*, *A*^{n} is symmetric if *A* is symmetric. If *A*^{−1} exists, it is symmetric if and only if *A* is symmetric.
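The failure of the product rule for non-commuting factors is easy to exhibit numerically. A small pure-Python sketch (helper names are illustrative):

```python
def matmul(A, B):
    """Plain triple-loop matrix product."""
    n, m, p = len(A), len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(m)) for j in range(p)]
            for i in range(n)]

def is_symmetric(M):
    return all(M[i][j] == M[j][i] for i in range(len(M)) for j in range(len(M)))

A = [[1, 0], [0, 2]]          # symmetric (diagonal)
B = [[0, 1], [1, 0]]          # symmetric
AB, BA = matmul(A, B), matmul(B, A)
print(AB == BA)               # False: A and B do not commute
print(is_symmetric(AB))       # False: so AB need not be symmetric
```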

Let Mat_{n} denote the space of *n* × *n* matrices. A symmetric *n* × *n* matrix is determined by *n*(*n* + 1)/2 scalars (the number of entries on or above the main diagonal). Similarly, a skew-symmetric matrix is determined by *n*(*n* − 1)/2 scalars (the number of entries above the main diagonal). If Sym_{n} denotes the space of *n* × *n* symmetric matrices and Skew_{n} the space of *n* × *n* skew-symmetric matrices, then Mat_{n} = Sym_{n} + Skew_{n} and Sym_{n} ∩ Skew_{n} = {0}, i.e.

Mat_{n} = Sym_{n} ⊕ Skew_{n},

where ⊕ denotes the direct sum. Let *X* ∈ Mat_{n}; then

*X* = 1/2(*X* + *X*^{T}) + 1/2(*X* − *X*^{T}).

Notice that 1/2(*X* + *X*^{T}) ∈ Sym_{n} and 1/2(*X* − *X*^{T}) ∈ Skew_{n}. This is true for every square matrix *X* with entries from any field whose characteristic is different from 2.
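This splitting into symmetric and skew-symmetric parts can be sketched in a few lines of Python (names are illustrative):

```python
def sym_skew_parts(X):
    """Split a square matrix X into its symmetric and skew-symmetric parts."""
    n = len(X)
    S = [[(X[i][j] + X[j][i]) / 2 for j in range(n)] for i in range(n)]
    K = [[(X[i][j] - X[j][i]) / 2 for j in range(n)] for i in range(n)]
    return S, K

X = [[1, 4],
     [2, 3]]
S, K = sym_skew_parts(X)
print(S)  # [[1.0, 3.0], [3.0, 3.0]]  -- symmetric part
print(K)  # [[0.0, 1.0], [-1.0, 0.0]] -- skew-symmetric part
```

Adding the two parts entrywise recovers *X*.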

Any matrix congruent to a symmetric matrix is again symmetric: if *X* is a symmetric matrix then so is *AXA*^{T} for any matrix *A*. A symmetric matrix is necessarily a normal matrix.

### Real symmetric matrices

Denote by ⟨·, ·⟩ the standard inner product on **R**^{n}. The real *n*-by-*n* matrix *A* is symmetric if and only if

⟨*A***x**, **y**⟩ = ⟨**x**, *A***y**⟩ for all **x**, **y** ∈ **R**^{n}.

Since this definition is independent of the choice of basis, symmetry is a property that depends only on the linear operator *A* and a choice of inner product. This characterization of symmetry is useful, for example, in differential geometry: each tangent space to a manifold may be endowed with an inner product, giving rise to what is called a Riemannian manifold. Another area where this formulation is used is in Hilbert spaces.

The finite-dimensional spectral theorem says that any symmetric matrix whose entries are real can be diagonalized by an orthogonal matrix. More explicitly: For every symmetric real matrix *A* there exists a real orthogonal matrix *Q* such that *D* = *Q*^{T}*AQ* is a diagonal matrix. Every symmetric matrix is thus, up to choice of an orthonormal basis, a diagonal matrix.
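For 2 × 2 symmetric matrices the spectral theorem can be made fully explicit with a single Jacobi rotation. A minimal pure-Python sketch (the closed-form angle θ = ½·atan2(2*b*, *a* − *c*) zeroes the off-diagonal entry of *Q*^{T}*AQ*; all names are illustrative):

```python
import math

def spectral_2x2(a, b, c):
    """Orthogonally diagonalize the symmetric matrix [[a, b], [b, c]].

    Returns a real orthogonal Q and the two (real) eigenvalues,
    so that Q^T A Q = diag(l1, l2)."""
    theta = 0.5 * math.atan2(2 * b, a - c)   # Jacobi rotation angle
    co, si = math.cos(theta), math.sin(theta)
    Q = [[co, -si], [si, co]]                # columns are eigenvectors
    # Eigenvalues from the characteristic polynomial of a 2x2 symmetric matrix
    mean, half_gap = (a + c) / 2, math.hypot((a - c) / 2, b)
    return Q, (mean + half_gap, mean - half_gap)

Q, (l1, l2) = spectral_2x2(2.0, 1.0, 2.0)
print(l1, l2)   # 3.0 1.0 -- the real eigenvalues of [[2,1],[1,2]]
```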

If *A* and *B* are *n* × *n* real symmetric matrices that commute, then they can be simultaneously diagonalized: there exists a basis of **R**^{n} such that every element of the basis is an eigenvector for both *A* and *B*.

Every real symmetric matrix is Hermitian, and therefore all its eigenvalues are real. (In fact, the eigenvalues are the entries in the diagonal matrix *D* (above), and therefore *D* is uniquely determined by *A* up to the order of its entries.) Essentially, the property of being symmetric for real matrices corresponds to the property of being Hermitian for complex matrices.

### Complex symmetric matrices

A complex symmetric matrix can be 'diagonalized' using a unitary matrix: thus if *A* is a complex symmetric matrix, there is a unitary matrix *U* such that *UAU*^{T} is a real diagonal matrix. This result is referred to as the **Autonne–Takagi factorization**. It was originally proved by Léon Autonne (1915) and Teiji Takagi (1925) and rediscovered with different proofs by several other mathematicians.^{[2]}^{[3]} In fact, the matrix *B* = *A*^{†}*A* is Hermitian and non-negative, so there is a unitary matrix *V* such that *V*^{†}*BV* is diagonal with non-negative real entries. Thus *C* = *V*^{T}*AV* is complex symmetric with *C*^{†}*C* real. Writing *C* = *X* + *iY* with *X* and *Y* real symmetric matrices, *C*^{†}*C* = *X*^{2} + *Y*^{2} + *i*(*XY* − *YX*); since *C*^{†}*C* is real, *XY* = *YX*. Since *X* and *Y* commute, there is a real orthogonal matrix *W* such that both *WXW*^{T} and *WYW*^{T} are diagonal. Setting *U* = *WV*^{T}, the matrix *UAU*^{T} is complex diagonal. Post-multiplying *U* by a suitable diagonal unitary matrix, the diagonal entries can be made real and non-negative. Since their squares are the eigenvalues of *A*^{†}*A*, they coincide with the singular values of *A*. (Note that, regarding the eigendecomposition of a complex symmetric matrix *A*, the Jordan normal form of *A* need not be diagonal, so *A* may not be diagonalizable by any similarity transformation.)

## Decomposition

Using the Jordan normal form, one can prove that every square real matrix can be written as a product of two real symmetric matrices, and every square complex matrix can be written as a product of two complex symmetric matrices.^{[4]}

Every real non-singular matrix can be uniquely factored as the product of an orthogonal matrix and a symmetric positive definite matrix, which is called a polar decomposition. Singular matrices can also be factored, but not uniquely.

The Cholesky decomposition states that every real positive-definite symmetric matrix *A* is the product of a lower-triangular matrix *L* and its transpose, *A* = *LL*^{T}. If the matrix is symmetric indefinite, it may still be decomposed as *PAP*^{T} = *LDL*^{T}, where *P* is a permutation matrix (arising from the need to pivot), *L* is a lower unit triangular matrix, and *D* is a direct sum of symmetric 1 × 1 and 2 × 2 blocks; this is called the Bunch–Kaufman decomposition.^{[5]}
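The Cholesky factorization itself fits in a few lines. A minimal pure-Python sketch (no pivoting, so it assumes the input really is symmetric positive-definite):

```python
import math

def cholesky(A):
    """Cholesky factor L (lower triangular) of a real symmetric
    positive-definite matrix A, so that A = L L^T."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(A[i][i] - s)   # diagonal entry
            else:
                L[i][j] = (A[i][j] - s) / L[j][j]  # below-diagonal entry
    return L

A = [[4.0, 2.0],
     [2.0, 3.0]]
L = cholesky(A)
print(L)  # [[2.0, 0.0], [1.0, 1.4142135623730951]]
```

Production code (e.g. LAPACK) adds pivoting and error handling; this sketch only shows the recurrence.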

A complex symmetric matrix may not be diagonalizable by similarity; every real symmetric matrix is diagonalizable by a real orthogonal similarity.

Every complex symmetric matrix *A* can be diagonalized by unitary congruence

*A* = *Q*Λ*Q*^{T},

where *Q* is a unitary matrix. If *A* is real, the matrix *Q* is a real orthogonal matrix (the columns of which are eigenvectors of *A*), and Λ is real and diagonal (having the eigenvalues of *A* on the diagonal). To see orthogonality, suppose **x** and **y** are eigenvectors corresponding to distinct eigenvalues λ_{1}, λ_{2}. Then

λ_{1}⟨**x**, **y**⟩ = ⟨*A***x**, **y**⟩ = ⟨**x**, *A***y**⟩ = λ_{2}⟨**x**, **y**⟩.

Since λ_{1} and λ_{2} are distinct, we have ⟨**x**, **y**⟩ = 0.

## Hessian

Symmetric *n*-by-*n* matrices of real functions appear as the Hessians of twice continuously differentiable functions of *n* real variables.

Every quadratic form *q* on **R**^{n} can be uniquely written in the form *q*(**x**) = **x**^{T}*A***x** with a symmetric *n*-by-*n* matrix *A*. Because of the above spectral theorem, one can then say that every quadratic form, up to the choice of an orthonormal basis of **R**^{n}, "looks like"

λ_{1}*x*_{1}^{2} + λ_{2}*x*_{2}^{2} + ⋯ + λ_{n}*x*_{n}^{2}
with real numbers *λ*_{i}. This considerably simplifies the study of quadratic forms, as well as the study of the level sets {**x** : *q*(**x**) = 1} which are generalizations of conic sections.
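Evaluating a quadratic form from its symmetric matrix is a direct double sum. A short Python sketch (names are illustrative):

```python
def quadratic_form(A, x):
    """Evaluate q(x) = x^T A x for a square matrix A."""
    n = len(x)
    return sum(x[i] * A[i][j] * x[j] for i in range(n) for j in range(n))

A = [[2, 1],
     [1, 2]]
# The eigenvalues of A are 3 and 1, so in the eigenbasis this form
# "looks like" 3*y1^2 + 1*y2^2.
print(quadratic_form(A, [1, 0]))  # 2
print(quadratic_form(A, [1, 1]))  # 6
```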

This is important partly because the second-order behavior of every smooth multi-variable function is described by the quadratic form belonging to the function's Hessian; this is a consequence of Taylor's theorem.

## Symmetrizable matrix

An *n*-by-*n* matrix *A* is said to be **symmetrizable** if there exists an invertible diagonal matrix *D* and a symmetric matrix *S* such that *A* = *DS*.
The transpose of a symmetrizable matrix is symmetrizable, since *A*^{T} = (*DS*)^{T} = *SD* = *D*^{−1}(*DSD*) and *DSD* is symmetric. A matrix *A* = (*a*_{ij}) is symmetrizable if and only if the following conditions are met:

- *a*_{ij} = 0 implies *a*_{ji} = 0 for all 1 ≤ *i* < *j* ≤ *n*;
- *a*_{i_{1}i_{2}} *a*_{i_{2}i_{3}} ⋯ *a*_{i_{k}i_{1}} = *a*_{i_{2}i_{1}} *a*_{i_{3}i_{2}} ⋯ *a*_{i_{1}i_{k}} for any finite sequence (*i*_{1}, *i*_{2}, …, *i*_{k}).
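A small exact-arithmetic example of a symmetrizable matrix; note that *A* = *DS* implies *A*^{T} = *D*^{−1}*AD* (all names are illustrative):

```python
from fractions import Fraction as F

def matmul(A, B):
    """Plain matrix product over exact rationals."""
    n, m, p = len(A), len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(m)) for j in range(p)]
            for i in range(n)]

D = [[F(1), F(0)], [F(0), F(2)]]         # invertible diagonal
Dinv = [[F(1), F(0)], [F(0), F(1, 2)]]   # its inverse
S = [[F(0), F(1)], [F(1), F(0)]]         # symmetric
A = matmul(D, S)                         # A = DS: symmetrizable, not symmetric
At = [[A[j][i] for j in range(2)] for i in range(2)]
print(A == [[F(0), F(1)], [F(2), F(0)]])       # True
print(At == matmul(Dinv, matmul(A, D)))        # True: A^T = D^-1 A D
```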

## See also

Other types of symmetry or pattern in square matrices have special names; see also symmetry in mathematics.

## Notes

- ↑ Jesús Rojo García (1986). *Álgebra lineal* (in Spanish) (2nd ed.). Editorial AC. ISBN 84-7288-120-2.
- ↑ Horn & Johnson 2013, p. 278
- ↑ See:
  - Autonne, L. (1915), "Sur les matrices hypohermitiennes et sur les matrices unitaires", *Ann. Univ. Lyon*, **38**: 1–77
  - Takagi, T. (1925), "On an algebraic problem related to an analytic theorem of Carathéodory and Fejér and on an allied theorem of Landau", *Japan. J. Math.*, **1**: 83–93
  - Siegel, Carl Ludwig (1943), "Symplectic Geometry", *American Journal of Mathematics*, **65**: 1–86, doi:10.2307/2371774, JSTOR 2371774, Lemma 1, page 12
  - Hua, L.-K. (1944), "On the theory of automorphic functions of a matrix variable I – geometric basis", *Amer. J. Math.*, **66**: 470–488, doi:10.2307/2371910
  - Schur, I. (1945), "Ein Satz über quadratische Formen mit komplexen Koeffizienten", *Amer. J. Math.*, **67**: 472–480, doi:10.2307/2371974
  - Benedetti, R.; Cragnolini, P. (1984), "On simultaneous diagonalization of one Hermitian and one symmetric form", *Linear Algebra Appl.*, **57**: 215–226, doi:10.1016/0024-3795(84)90189-7
- ↑ Bosch, A. J. (1986). "The factorization of a square matrix into two symmetric matrices". *American Mathematical Monthly*. **93** (6): 462–464. doi:10.2307/2323471. JSTOR 2323471.
- ↑ G. H. Golub; C. F. van Loan (1996). *Matrix Computations*. The Johns Hopkins University Press, Baltimore, London.

## References

- Horn, Roger A.; Johnson, Charles R. (2013), *Matrix analysis* (2nd ed.), Cambridge University Press, ISBN 978-0-521-54823-6

## External links

- Hazewinkel, Michiel, ed. (2001) [1994], "Symmetric matrix", *Encyclopedia of Mathematics*, Springer Science+Business Media B.V. / Kluwer Academic Publishers, ISBN 978-1-55608-010-4
- A brief introduction and proof of eigenvalue properties of the real symmetric matrix
- How to implement a Symmetric Matrix in C++