Symmetric matrix

# Definition

A square matrix A = [a_ij] is said to be symmetric if, for all i and all j:

 a_ij = a_ji

Two elements that are symmetric with respect to the main diagonal are equal: a symmetric matrix A is therefore equal to its transpose A'.

The family of symmetric matrices is particularly rich in "good" properties. In particular, the eigenvalues of a (real) symmetric matrix are real, and its eigenvectors can be chosen real as well. In addition, these eigenvectors form an orthonormal basis (see below).
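These properties are easy to check numerically. Here is a quick sketch using NumPy (the matrices are synthetic examples, not taken from this Glossary): a real matrix may well have complex eigenvalues, but a real symmetric one never does.

```python
import numpy as np

# A real matrix need not have real eigenvalues...
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])          # a 90-degree rotation matrix
print(np.linalg.eigvals(R))          # the eigenvalues are the complex pair +i, -i

# ...but a real *symmetric* matrix always does.
rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = (M + M.T) / 2                    # symmetrizing makes A equal to its transpose
vals = np.linalg.eigvalsh(A)         # eigvalsh assumes symmetry and returns real values
print(vals.dtype)                    # a real dtype (float64)
```

`eigvalsh` exploits the symmetry of its argument; for a general matrix, `eigvals` must allow for complex results.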

# Symmetric matrices, Statistics and Modeling

Statistics and Data Modeling call on symmetric matrices on many occasions:

* A covariance matrix is symmetric by definition (and in addition is positive semidefinite, see below).

* A projection matrix (see below) projects the data space onto a linear subspace. Principal Components Analysis (PCA) and Linear Regression can both be interpreted in terms of projections onto subspaces, and therefore in terms of projection matrices.

* Ridge Regression can be interpreted as a modified Regression on Principal Components that operates on the spectral decomposition (see below) of the data covariance matrix.

* Establishing the properties of quadratic forms in normal random variables also relies on projection matrices.
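The first point in this list can be verified directly. Below is a small NumPy check on synthetic data (the data matrix is an arbitrary example): a sample covariance matrix is symmetric, and its eigenvalues are nonnegative, i.e. it is positive semidefinite.

```python
import numpy as np

# Synthetic sample: 3 variables observed on 50 cases
rng = np.random.default_rng(1)
X = rng.standard_normal((50, 3))

C = np.cov(X, rowvar=False)          # 3x3 sample covariance matrix

# Symmetric by construction...
print(np.allclose(C, C.T))           # True

# ...and positive semidefinite: all eigenvalues are >= 0
print(np.linalg.eigvalsh(C))
```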

# Eigenvalues and eigenvectors of a symmetric matrix

Symmetric matrices enjoy many important properties. In the Tutorial below, we'll establish some of these properties, which will be needed elsewhere on this site.

## Eigenvalues and eigenvectors of a symmetric matrix are real

* The eigenvalues of a symmetric matrix are real. Recall that real coefficients alone do not guarantee real eigenvalues: a real matrix may well have complex eigenvalues.

* A (real) eigenvalue may be associated with a complex eigenvector, but it is then always possible to find a real eigenvector associated with this eigenvalue. We'll consider only these real eigenvectors, and we'll therefore regard the eigenvectors of a symmetric matrix as real.

## The eigenvectors of a symmetric matrix are orthogonal

We'll show that two eigenvectors associated with two distinct eigenvalues are orthogonal. A consequence is that if all the eigenvalues of the matrix are distinct, the normalized eigenvectors form an orthonormal basis.

This result remains true even when some eigenvalues are multiple, but the proof is then quite a bit more difficult, and we'll have to state the result without proof.

-----

So, quite generally:

 The eigenvectors of a symmetric matrix form an orthonormal basis.
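This orthonormality can be observed numerically. In the NumPy sketch below (the matrix is a synthetic example), the columns of the eigenvector matrix are pairwise orthogonal unit vectors, so the matrix times its own transpose gives the identity.

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((5, 5))
A = (M + M.T) / 2                    # a real symmetric matrix

_, U = np.linalg.eigh(A)             # columns of U are the normalized eigenvectors

# Pairwise orthogonal and unit-length: U'U is the identity
print(np.allclose(U.T @ U, np.eye(5)))   # True
```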

# Spectral decomposition of a symmetric matrix

Denote by U the square matrix of order p whose columns are the eigenvectors of a symmetric matrix A. We'll show that:

 A = UDU'

where D = diag(λ_1, λ_2, ..., λ_p) is the diagonal matrix of the eigenvalues of A.

This fundamental expression is called the spectral decomposition of the symmetric matrix A.

As the eigenvectors of A form an orthonormal basis, the matrix U is an orthogonal matrix.
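The decomposition A = UDU' can be reconstructed numerically. In this NumPy sketch (again on a synthetic matrix), `eigh` supplies the eigenvalues and eigenvectors, and multiplying the factors back together recovers A.

```python
import numpy as np

rng = np.random.default_rng(3)
M = rng.standard_normal((4, 4))
A = (M + M.T) / 2                    # a real symmetric matrix

lam, U = np.linalg.eigh(A)           # eigenvalues lam, eigenvector matrix U
D = np.diag(lam)

# Spectral decomposition: A = U D U', with U orthogonal (U' is its inverse)
print(np.allclose(A, U @ D @ U.T))       # True
print(np.allclose(U @ U.T, np.eye(4)))   # True
```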

-----

Expanding the above expression leads to :

 A = Σ_i λ_i u_i u_i'

where {ui} is the set of the eigenvectors of A.

Note the formal similarity with the expansion of the Singular Value Decomposition of a matrix.

We'll see that it can be interpreted in terms of projectors on the eigenvectors of A.
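The interpretation in terms of projectors can also be checked numerically. In the sketch below (synthetic matrix), A is rebuilt as a weighted sum of the rank-one matrices u_i u_i', each of which is idempotent and so projects onto the line spanned by u_i.

```python
import numpy as np

rng = np.random.default_rng(4)
M = rng.standard_normal((4, 4))
A = (M + M.T) / 2

lam, U = np.linalg.eigh(A)

# Rebuild A as a weighted sum of the rank-one projectors u_i u_i'
A_rebuilt = sum(lam[i] * np.outer(U[:, i], U[:, i]) for i in range(4))
print(np.allclose(A, A_rebuilt))     # True

# Each u_i u_i' is idempotent: it projects onto the line spanned by u_i
P0 = np.outer(U[:, 0], U[:, 0])
print(np.allclose(P0 @ P0, P0))      # True
```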

-----

This result is central for many questions pertaining to covariance matrices, in particular :

* In Principal Components Analysis,

* In Ridge Regression.

---------------------------------------------

Two sub-families of the family of symmetric matrices have properties and interpretations that are useful for the statistician :

* Projection matrices,

* And positive (semi-) definite matrices.

# Projection matrices

Consider the following configuration:

* A subspace S of the vector space E.

* A vector x and its orthogonal projection u on S.

The orthogonal projection onto S is a linear operator, which can therefore be represented by a matrix. A matrix representing an orthogonal projection operator is called a projection matrix, and it is always symmetric.

Because of their importance in Statistics, a special entry of this Glossary is dedicated to projection matrices.
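As a concrete statistical example (a NumPy sketch on a synthetic design matrix, not taken from this Glossary), the "hat" matrix P = X(X'X)⁻¹X' of a linear regression projects orthogonally onto the column space of X; it is symmetric and idempotent, so projecting twice changes nothing.

```python
import numpy as np

rng = np.random.default_rng(5)
X = rng.standard_normal((10, 3))     # synthetic design matrix, full column rank

# Orthogonal projection onto the column space of X (the "hat" matrix)
P = X @ np.linalg.inv(X.T @ X) @ X.T

print(np.allclose(P, P.T))           # symmetric: True
print(np.allclose(P @ P, P))         # idempotent: True

# Projecting a vector, then projecting again, changes nothing
x = rng.standard_normal(10)
u = P @ x
print(np.allclose(P @ u, u))         # True
```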

# Positive (semi-) definite matrices

A symmetric matrix A is said to be positive semi-definite if, for any nonzero vector x:

 x'Ax ≥ 0

If this inequality is always strict, the matrix is said to be positive definite.
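The definition can be tested numerically. In this NumPy sketch (a synthetic matrix built to be positive definite, B'B plus a small multiple of the identity), the quadratic form x'Ax is strictly positive, which is equivalent to all eigenvalues being strictly positive.

```python
import numpy as np

rng = np.random.default_rng(6)
B = rng.standard_normal((4, 4))
A = B.T @ B + 0.1 * np.eye(4)        # B'B is semi-definite; the added term makes A definite

# x'Ax > 0 for a nonzero x
x = rng.standard_normal(4)
print(x @ A @ x > 0)                 # True

# Equivalently, all eigenvalues are strictly positive
print(np.all(np.linalg.eigvalsh(A) > 0))   # True
```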

-----

Positive definite matrices play an important role in Statistics (essentially because covariance matrices are positive semidefinite), and an entry of this Glossary is dedicated to positive definite matrices.

________________________________________________________________

# Tutorial

In this Tutorial, we go over those properties of symmetric matrices that will be needed throughout this site for establishing important results about:

* Principal Components Analysis,

* Linear Regression (Simple and Multiple),

* Ridge Regression,

* Multivariate normal distribution,

* Quadratic forms in normal variables.

----

Only the general properties of symmetric matrices are addressed in this Tutorial.

* The properties of projection matrices are addressed here.

* The properties of positive definite matrices are addressed here.
