# Cross-covariance matrix

In probability theory and statistics, a cross-covariance matrix is a matrix whose element in the i, j position is the covariance between the i-th element of a random vector and the j-th element of another random vector. A random vector is a random variable with multiple dimensions; each element of the vector is a scalar random variable. Each element has either a finite number of observed empirical values or a finite or infinite number of potential values, where the potential values are specified by a theoretical joint probability distribution. Intuitively, the cross-covariance matrix generalizes the notion of covariance to multiple dimensions.

The cross-covariance matrix of two random vectors $\mathbf {X}$ and $\mathbf {Y}$ is typically denoted by $\operatorname {K} _{\mathbf {X} \mathbf {Y} }$ or $\Sigma _{\mathbf {X} \mathbf {Y} }$ .

## Definition

For random vectors $\mathbf {X}$  and $\mathbf {Y}$ , each containing random elements whose expected value and variance exist, the cross-covariance matrix of $\mathbf {X}$  and $\mathbf {Y}$  is defined by

$\operatorname {K} _{\mathbf {X} \mathbf {Y} }=\operatorname {cov} (\mathbf {X} ,\mathbf {Y} ){\stackrel {\mathrm {def} }{=}}\ \operatorname {E} [(\mathbf {X} -\mathbf {\mu _{X}} )(\mathbf {Y} -\mathbf {\mu _{Y}} )^{\rm {T}}]$

(Eq.1)

where $\mathbf {\mu _{X}} =\operatorname {E} [\mathbf {X} ]$  and $\mathbf {\mu _{Y}} =\operatorname {E} [\mathbf {Y} ]$  are vectors containing the expected values of $\mathbf {X}$  and $\mathbf {Y}$ . The vectors $\mathbf {X}$  and $\mathbf {Y}$  need not have the same dimension, and either might be a scalar value.

The cross-covariance matrix is the matrix whose $(i,j)$  entry is the covariance

$\operatorname {K} _{X_{i}Y_{j}}=\operatorname {cov} [X_{i},Y_{j}]=\operatorname {E} [(X_{i}-\operatorname {E} [X_{i}])(Y_{j}-\operatorname {E} [Y_{j}])]$

between the i-th element of $\mathbf {X}$  and the j-th element of $\mathbf {Y}$ . This gives the following component-wise definition of the cross-covariance matrix.

$\operatorname {K} _{\mathbf {X} \mathbf {Y} }={\begin{bmatrix}\mathrm {E} [(X_{1}-\operatorname {E} [X_{1}])(Y_{1}-\operatorname {E} [Y_{1}])]&\mathrm {E} [(X_{1}-\operatorname {E} [X_{1}])(Y_{2}-\operatorname {E} [Y_{2}])]&\cdots &\mathrm {E} [(X_{1}-\operatorname {E} [X_{1}])(Y_{n}-\operatorname {E} [Y_{n}])]\\\\\mathrm {E} [(X_{2}-\operatorname {E} [X_{2}])(Y_{1}-\operatorname {E} [Y_{1}])]&\mathrm {E} [(X_{2}-\operatorname {E} [X_{2}])(Y_{2}-\operatorname {E} [Y_{2}])]&\cdots &\mathrm {E} [(X_{2}-\operatorname {E} [X_{2}])(Y_{n}-\operatorname {E} [Y_{n}])]\\\\\vdots &\vdots &\ddots &\vdots \\\\\mathrm {E} [(X_{m}-\operatorname {E} [X_{m}])(Y_{1}-\operatorname {E} [Y_{1}])]&\mathrm {E} [(X_{m}-\operatorname {E} [X_{m}])(Y_{2}-\operatorname {E} [Y_{2}])]&\cdots &\mathrm {E} [(X_{m}-\operatorname {E} [X_{m}])(Y_{n}-\operatorname {E} [Y_{n}])]\end{bmatrix}}$

## Example

For example, if $\mathbf {X} =\left(X_{1},X_{2},X_{3}\right)^{\rm {T}}$  and $\mathbf {Y} =\left(Y_{1},Y_{2}\right)^{\rm {T}}$  are random vectors, then $\operatorname {cov} (\mathbf {X} ,\mathbf {Y} )$  is a $3\times 2$  matrix whose $(i,j)$ -th entry is $\operatorname {cov} (X_{i},Y_{j})$ .
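As an illustrative sketch (variable names such as `K_XY` are ours, not standard notation), the defining expectation can be estimated from samples by centering each vector and averaging the outer products, which yields exactly the $3\times 2$ shape described above:

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw samples: rows are observations; X is 3-dimensional, Y is 2-dimensional.
n = 100_000
X = rng.normal(size=(n, 3))
Y = X[:, :2] + 0.5 * rng.normal(size=(n, 2))  # Y correlated with X1 and X2 only

# Sample estimate of K_XY = E[(X - mu_X)(Y - mu_Y)^T]
Xc = X - X.mean(axis=0)
Yc = Y - Y.mean(axis=0)
K_XY = Xc.T @ Yc / (n - 1)   # 3 x 2 matrix; entry (i, j) estimates cov(X_i, Y_j)
```

Here `K_XY[0, 0]` estimates $\operatorname{cov}(X_1, Y_1) = 1$, while `K_XY[2, 0]` estimates $\operatorname{cov}(X_3, Y_1) = 0$, since $X_3$ plays no role in $Y_1$.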

## Properties

For the cross-covariance matrix, the following basic properties apply:

1. $\operatorname {cov} (\mathbf {X} ,\mathbf {Y} )=\operatorname {E} [\mathbf {X} \mathbf {Y} ^{\rm {T}}]-\mathbf {\mu _{X}} \mathbf {\mu _{Y}} ^{\rm {T}}$
2. $\operatorname {cov} (\mathbf {X} ,\mathbf {Y} )=\operatorname {cov} (\mathbf {Y} ,\mathbf {X} )^{\rm {T}}$
3. $\operatorname {cov} (\mathbf {X_{1}} +\mathbf {X_{2}} ,\mathbf {Y} )=\operatorname {cov} (\mathbf {X_{1}} ,\mathbf {Y} )+\operatorname {cov} (\mathbf {X_{2}} ,\mathbf {Y} )$
4. $\operatorname {cov} (A\mathbf {X} +\mathbf {a} ,B^{\rm {T}}\mathbf {Y} +\mathbf {b} )=A\,\operatorname {cov} (\mathbf {X} ,\mathbf {Y} )\,B$
5. If $\mathbf {X}$  and $\mathbf {Y}$  are independent (or somewhat less restrictedly, if every random variable in $\mathbf {X}$  is uncorrelated with every random variable in $\mathbf {Y}$ ), then $\operatorname {cov} (\mathbf {X} ,\mathbf {Y} )=0_{p\times q}$

where $\mathbf {X}$ , $\mathbf {X_{1}}$  and $\mathbf {X_{2}}$  are random $p\times 1$  vectors, $\mathbf {Y}$  is a random $q\times 1$  vector, $\mathbf {a}$  is a $q\times 1$  vector, $\mathbf {b}$  is a $p\times 1$  vector, $A$  and $B$  are $q\times p$  matrices of constants, and $0_{p\times q}$  is a $p\times q$  matrix of zeroes.
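These identities can be checked numerically. The sketch below (our own helper `cov`, using $1/n$ normalization so the algebraic identities hold exactly for sample data) verifies properties 1, 2, and 4; note that with observations stored as rows, $A\mathbf{X}$ becomes `X @ A.T` and $B^{\rm T}\mathbf{Y}$ becomes `Y @ B`:

```python
import numpy as np

rng = np.random.default_rng(1)
p, q, n = 4, 3, 1000

X = rng.normal(size=(n, p))             # rows are samples of the p-dim vector X
Y = rng.normal(size=(n, q)) + X[:, :q]  # a q-dim vector correlated with X

def cov(U, V):
    # Sample cross-covariance (rows = observations), 1/n normalization
    # so the identities below hold exactly up to floating-point error.
    return (U - U.mean(0)).T @ (V - V.mean(0)) / len(U)

A = rng.normal(size=(q, p))
B = rng.normal(size=(q, p))
a = rng.normal(size=q)
b = rng.normal(size=p)

# Property 1: cov(X,Y) = E[X Y^T] - mu_X mu_Y^T
assert np.allclose(cov(X, Y),
                   X.T @ Y / n - np.outer(X.mean(0), Y.mean(0)))

# Property 2: cov(X,Y) = cov(Y,X)^T
assert np.allclose(cov(X, Y), cov(Y, X).T)

# Property 4: cov(AX + a, B^T Y + b) = A cov(X,Y) B
assert np.allclose(cov(X @ A.T + a, Y @ B + b),
                   A @ cov(X, Y) @ B)
```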

## Definition for complex random vectors

If $\mathbf {Z}$  and $\mathbf {W}$  are complex random vectors, the definition of the cross-covariance matrix is slightly changed. Transposition is replaced by Hermitian transposition:

$\operatorname {K} _{\mathbf {Z} \mathbf {W} }=\operatorname {cov} (\mathbf {Z} ,\mathbf {W} ){\stackrel {\mathrm {def} }{=}}\ \operatorname {E} [(\mathbf {Z} -\mathbf {\mu _{Z}} )(\mathbf {W} -\mathbf {\mu _{W}} )^{\rm {H}}]$

For complex random vectors, another matrix called the pseudo-cross-covariance matrix is defined as follows:

$\operatorname {J} _{\mathbf {Z} \mathbf {W} }=\operatorname {cov} (\mathbf {Z} ,{\overline {\mathbf {W} }}){\stackrel {\mathrm {def} }{=}}\ \operatorname {E} [(\mathbf {Z} -\mathbf {\mu _{Z}} )(\mathbf {W} -\mathbf {\mu _{W}} )^{\rm {T}}]$
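A small numerical sketch (sample estimates; the names `K_ZW` and `J_ZW` mirror the notation above) shows the difference between the two definitions: for circularly symmetric complex data such as the Gaussian vectors below, the pseudo-cross-covariance matrix is approximately zero while the cross-covariance matrix is not.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

# Complex random vectors: rows are samples; Z is 3-dim, W is 2-dim.
Z = rng.normal(size=(n, 3)) + 1j * rng.normal(size=(n, 3))
W = Z[:, :2] + 0.3 * (rng.normal(size=(n, 2)) + 1j * rng.normal(size=(n, 2)))

Zc = Z - Z.mean(0)
Wc = W - W.mean(0)

# Cross-covariance: Hermitian transpose, i.e. conjugate the second factor.
K_ZW = Zc.T @ Wc.conj() / n          # estimates E[(Z - mu_Z)(W - mu_W)^H]

# Pseudo-cross-covariance: plain transpose, no conjugation.
J_ZW = Zc.T @ Wc / n                 # estimates E[(Z - mu_Z)(W - mu_W)^T]
```

Because the real and imaginary parts of each component are independent with equal variance, terms like $\operatorname{E}[Z_i W_j]$ cancel and `J_ZW` is near zero, whereas `K_ZW[0, 0]` estimates $\operatorname{E}[|Z_1|^2] = 2$.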

## Uncorrelatedness

Two random vectors $\mathbf {X}$  and $\mathbf {Y}$  are called uncorrelated if their cross-covariance matrix $\operatorname {K} _{\mathbf {X} \mathbf {Y} }$  is a zero matrix.

Complex random vectors $\mathbf {Z}$  and $\mathbf {W}$  are called uncorrelated if their cross-covariance matrix and pseudo-cross-covariance matrix are both zero, i.e. if $\operatorname {K} _{\mathbf {Z} \mathbf {W} }=\operatorname {J} _{\mathbf {Z} \mathbf {W} }=0$ .