In linear algebra, a **symmetric matrix** is a square matrix that is equal to its transpose. Let *A* be a symmetric matrix. Then *A* = *A*^{T}.

The entries of a symmetric matrix are symmetric with respect to the main diagonal (top left to bottom right). So if the entries are written as *A* = (*a*_{ij}), then *a*_{ij} = *a*_{ji} for all indices *i* and *j*. The following 3×3 matrix is symmetric:

    ⎡  1   7   3 ⎤
    ⎢  7   4  −5 ⎥
    ⎣  3  −5   6 ⎦
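The defining condition *A* = *A*^{T} translates directly into code. A minimal sketch, assuming NumPy is available (the example matrix and the helper `is_symmetric` are illustrative, not from the source):

```python
import numpy as np

# An example symmetric matrix: entries mirror across the main diagonal.
A = np.array([[1, 7, 3],
              [7, 4, -5],
              [3, -5, 6]])

def is_symmetric(M):
    """Return True if M is square and equals its transpose."""
    return M.shape[0] == M.shape[1] and np.array_equal(M, M.T)

print(is_symmetric(A))                            # True
print(is_symmetric(np.array([[1, 2], [0, 1]])))   # False
```

For floating-point matrices, `np.allclose(M, M.T)` is the more robust comparison, since round-off can break exact equality.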

Every diagonal matrix is symmetric, since all off-diagonal entries are zero. Similarly, each diagonal element of a skew-symmetric matrix must be zero, since each is its own negative.

In linear algebra, a real symmetric matrix represents a self-adjoint operator over a real inner product space. The corresponding object for a complex inner product space is a Hermitian matrix with complex-valued entries, which is equal to its conjugate transpose. Therefore, in linear algebra over the complex numbers, it is generally assumed that a symmetric matrix refers to one which has real-valued entries. Symmetric matrices appear naturally in a variety of applications, and typical numerical linear algebra software makes special accommodations for them.


The finite-dimensional spectral theorem says that any symmetric matrix whose entries are real can be diagonalized by an orthogonal matrix. More explicitly: For every symmetric real matrix *A* there exists a real orthogonal matrix *Q* such that *D* = *Q*^{T}*AQ* is a diagonal matrix. Every symmetric matrix is thus, up to choice of an orthonormal basis, a diagonal matrix.

Another way of stating the real spectral theorem is that the eigenvectors of a symmetric matrix are orthogonal. More precisely, a matrix is symmetric if and only if it has an orthonormal basis of eigenvectors.

Every real symmetric matrix is Hermitian, and therefore all its eigenvalues are real. (In fact, the eigenvalues are the entries in the above diagonal matrix *D*, and therefore *D* is uniquely determined by *A* up to the order of its entries.) Essentially, the property of being symmetric for real matrices corresponds to the property of being Hermitian for complex matrices.
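The spectral theorem can be checked numerically. A sketch using `numpy.linalg.eigh`, which is specialized for symmetric (Hermitian) input and returns real eigenvalues with orthonormal eigenvectors (the test matrix below is an arbitrary example):

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# eigh guarantees real eigenvalues and an orthogonal eigenvector matrix Q.
eigenvalues, Q = np.linalg.eigh(A)

# Q is orthogonal: Q^T Q = I.
print(np.allclose(Q.T @ Q, np.eye(3)))                 # True
# Q^T A Q is the diagonal matrix D of eigenvalues.
print(np.allclose(Q.T @ A @ Q, np.diag(eigenvalues)))  # True
```

Using `eigh` rather than the general `eig` both exploits symmetry for speed and guarantees the real, orthogonal output the theorem promises.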

The sum and difference of two symmetric matrices is again symmetric, but the same is not always true for the product: given symmetric matrices *A* and *B*, the product *AB* is symmetric if and only if *A* and *B* commute, i.e., if *AB* = *BA*. So for any integer *n*, *A*^{n} is symmetric if *A* is symmetric.

If *A*^{−1} exists, it is symmetric if *A* is symmetric.
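These closure properties are easy to probe numerically. A sketch with two arbitrary symmetric matrices, chosen so that they do not commute:

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 3.0]])
B = np.array([[1.0, 2.0], [2.0, 0.0]])

def is_symmetric(M):
    return np.allclose(M, M.T)

# A and B are symmetric, but their product need not be:
# here A and B do not commute, so AB is not symmetric.
print(is_symmetric(A @ B))                          # False

# Powers of a symmetric matrix are symmetric...
print(is_symmetric(np.linalg.matrix_power(A, 3)))   # True

# ...and so is the inverse, when it exists.
print(is_symmetric(np.linalg.inv(A)))               # True
```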

Let Mat_{n} denote the space of *n* × *n* matrices. A symmetric *n* × *n* matrix is determined by *n*(*n* + 1)/2 scalars (the number of entries on or above the main diagonal). Similarly, a skew-symmetric matrix is determined by *n*(*n* − 1)/2 scalars (the number of entries above the main diagonal). If Sym_{n} denotes the space of *n* × *n* symmetric matrices and Skew_{n} the space of *n* × *n* skew-symmetric matrices, then since Mat_{n} = Sym_{n} + Skew_{n} and Sym_{n} ∩ Skew_{n} = {0},

Mat_{n} = Sym_{n} ⊕ Skew_{n},

where ⊕ denotes the direct sum. Every *X* ∈ Mat_{n} can then be written as

*X* = ½(*X* + *X*^{T}) + ½(*X* − *X*^{T}).

Notice that ½(*X* + *X*^{T}) ∈ Sym_{n} and ½(*X* − *X*^{T}) ∈ Skew_{n}. This is true for every square matrix *X* with entries from any field whose characteristic is different from 2.
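The decomposition is a one-liner in practice. A sketch with an arbitrary (non-symmetric) matrix:

```python
import numpy as np

X = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 10.0]])

# Split X into its symmetric and skew-symmetric parts.
sym_part = 0.5 * (X + X.T)
skew_part = 0.5 * (X - X.T)

print(np.allclose(sym_part, sym_part.T))     # True: symmetric part
print(np.allclose(skew_part, -skew_part.T))  # True: skew-symmetric part
print(np.allclose(sym_part + skew_part, X))  # True: the parts sum back to X
```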

Any matrix congruent to a symmetric matrix is again symmetric: if *X* is a symmetric matrix then so is *AXA*^{T} for any matrix *A*.

Denote by ⟨·, ·⟩ the standard inner product on **R**^{n}. The real *n*-by-*n* matrix *A* is symmetric if and only if

⟨*A***x**, **y**⟩ = ⟨**x**, *A***y**⟩ for all **x**, **y** ∈ **R**^{n}.

Since this definition is independent of the choice of basis, symmetry is a property that depends only on the linear operator *A* and a choice of inner product. This characterization of symmetry is useful, for example, in differential geometry: each tangent space to a manifold may be endowed with an inner product, giving rise to what is called a Riemannian manifold. Another area where this formulation is used is in Hilbert spaces.

A symmetric matrix is a normal matrix, since it commutes with its transpose.

Using the Jordan normal form, one can prove that every square real matrix can be written as a product of two real symmetric matrices, and every square complex matrix can be written as a product of two complex symmetric matrices. (Bosch, 1986)

Every real non-singular matrix can be uniquely factored as the product of an orthogonal matrix and a symmetric positive definite matrix, which is called a polar decomposition. Singular matrices can also be factored, but not uniquely.
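One common way to compute a polar decomposition is via the singular value decomposition: if *A* = *U*Σ*V*^{T}, then *A* = (*UV*^{T})(*V*Σ*V*^{T}), with *UV*^{T} orthogonal and *V*Σ*V*^{T} symmetric positive definite for non-singular *A*. A sketch (SciPy's `scipy.linalg.polar` offers this directly; the matrix below is an arbitrary non-singular example):

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [0.0, 2.0]])

# SVD-based polar decomposition: A = Q P.
U, s, Vt = np.linalg.svd(A)
Q = U @ Vt                   # orthogonal factor
P = Vt.T @ np.diag(s) @ Vt   # symmetric positive-definite factor

print(np.allclose(Q @ P, A))            # True: A = QP
print(np.allclose(Q.T @ Q, np.eye(2)))  # True: Q is orthogonal
print(np.allclose(P, P.T))              # True: P is symmetric
```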

The Cholesky decomposition states that every real positive-definite symmetric matrix *A* can be factored as *A* = *LL*^{T} with *L* lower triangular (equivalently, *A* = *U*^{T}*U* with *U* upper triangular).
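A sketch of the factorization with `numpy.linalg.cholesky`, which returns the lower-triangular factor (the positive-definite example matrix is illustrative):

```python
import numpy as np

# A positive-definite symmetric matrix (both eigenvalues are positive).
A = np.array([[4.0, 2.0],
              [2.0, 3.0]])

# NumPy returns the lower-triangular L with A = L @ L.T.
L = np.linalg.cholesky(A)

print(np.allclose(L @ L.T, A))     # True: A is recovered
print(np.allclose(L, np.tril(L)))  # True: L is lower triangular
```

The routine raises `numpy.linalg.LinAlgError` if the input is not positive definite, which also makes it a convenient positive-definiteness test.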

Every real symmetric matrix *A* can be diagonalized; moreover, the eigendecomposition takes a simpler form:

*A* = *Q*Λ*Q*^{T}

where *Q* is an orthogonal matrix (the columns of which are eigenvectors of *A*), and *Λ* is real and diagonal (having the eigenvalues of *A* on the diagonal).

Symmetric real *n*-by-*n* matrices appear as the Hessian of twice continuously differentiable functions of *n* real variables.

Every quadratic form *q* on **R**^{n} can be uniquely written in the form *q*(**x**) = **x**^{T}*A***x** with a symmetric *n*-by-*n* matrix *A*. Because of the above spectral theorem, one can then say that every quadratic form, up to the choice of an orthonormal basis of **R**^{n}, "looks like"

*q*(*x*_{1}, …, *x*_{n}) = λ_{1}*x*_{1}² + λ_{2}*x*_{2}² + ⋯ + λ_{n}*x*_{n}²

with real numbers λ_{i}. This considerably simplifies the study of quadratic forms, as well as the study of the level sets {**x** : *q*(**x**) = 1}, which are generalizations of conic sections.
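The diagonalized form of a quadratic form can be verified numerically: evaluating **x**^{T}*A***x** directly agrees with summing λ_{i}*y*_{i}² in the eigenbasis coordinates **y** = *Q*^{T}**x**. A sketch with an arbitrary symmetric matrix and test vector:

```python
import numpy as np

# Quadratic form q(x) = x^T A x with a symmetric A.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues, Q = np.linalg.eigh(A)

x = np.array([1.0, -2.0])
y = Q.T @ x  # coordinates of x in the orthonormal eigenbasis

q_direct = x @ A @ x
q_diagonal = np.sum(eigenvalues * y**2)  # sum of lambda_i * y_i^2

print(np.allclose(q_direct, q_diagonal))  # True
```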

This is important partly because the second-order behavior of every smooth multi-variable function is described by the quadratic form belonging to the function's Hessian; this is a consequence of Taylor's theorem.

An *n*-by-*n* matrix *A* is said to be **symmetrizable** if there exist an invertible diagonal matrix *D* and a symmetric matrix *S* such that *A* = *DS*. The transpose of a symmetrizable matrix is symmetrizable, since *A*^{T} = (*DS*)^{T} = *SD* = *D*^{−1}(*DSD*), and *DSD* is symmetric.

Other types of symmetry or pattern in square matrices have special names; see for example:

- Circulant matrix
- Hankel matrix
- Toeplitz matrix
- Centrosymmetric matrix
- Hilbert matrix
- Coxeter matrix
- Covariance matrix
- Antimetric matrix
- Skew-symmetric matrix

See also symmetry in mathematics.

- A. J. Bosch (1986). "The factorization of a square matrix into two symmetric matrices". *American Mathematical Monthly* **93** (6): 462–464. doi:10.2307/2323471. http://jstor.org/stable/2323471.