Complex random vector

In probability theory and statistics, a complex random vector is typically a tuple of complex-valued random variables, and generally is a random variable taking values in a vector space over the field of complex numbers. If $Z_1, \ldots, Z_n$ are complex-valued random variables, then the $n$-tuple $(Z_1, \ldots, Z_n)$ is a complex random vector. Complex random vectors can always be considered as pairs of real random vectors: their real and imaginary parts.

Some concepts of real random vectors have a straightforward generalization to complex random vectors, for example the definition of the mean, which is simply taken component-wise. Other concepts are unique to complex random vectors.

Applications of complex random vectors are found in digital signal processing.


Definition

A complex random vector $Z = (Z_1, \ldots, Z_n)^{\mathsf T}$ on the probability space $(\Omega, \mathcal{F}, P)$ is a function $Z \colon \Omega \to \mathbb{C}^n$ such that the vector $(\Re(Z_1), \Im(Z_1), \ldots, \Re(Z_n), \Im(Z_n))^{\mathsf T}$ is a real random vector on $(\Omega, \mathcal{F}, P)$, where $\Re(z)$ denotes the real part of $z$ and $\Im(z)$ the imaginary part of $z$.[1]:p. 292

Cumulative distribution function

The generalization of the cumulative distribution function from real to complex random variables is not obvious because expressions of the form $P(Z \leq 1 + 3i)$ make no sense. However, expressions of the form $P(\Re(Z) \leq 1, \Im(Z) \leq 3)$ make sense. Therefore, the cumulative distribution function $F_Z \colon \mathbb{C}^n \to [0, 1]$ of a random vector $Z = (Z_1, \ldots, Z_n)^{\mathsf T}$ is defined as

$$F_Z(z) = P(\Re(Z_1) \leq \Re(z_1), \Im(Z_1) \leq \Im(z_1), \ldots, \Re(Z_n) \leq \Re(z_n), \Im(Z_n) \leq \Im(z_n)) \qquad \text{(Eq.1)}$$

where $z = (z_1, \ldots, z_n)^{\mathsf T}$.


Expectation

As in the real case the expectation (also called expected value) of a complex random vector is taken component-wise:[1]:p. 293

$$\operatorname{E}[Z] = (\operatorname{E}[Z_1], \ldots, \operatorname{E}[Z_n])^{\mathsf T}$$
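As a quick numerical illustration, the component-wise expectation can be estimated from samples. This is a sketch using NumPy; the particular means, noise distribution, and sample count are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# 100,000 samples of a 2-component complex random vector
# Z = (Z_1, Z_2)^T with E[Z_1] = 1 + 2i and E[Z_2] = -i.
n = 100_000
noise = rng.standard_normal((n, 2)) + 1j * rng.standard_normal((n, 2))
Z = np.array([1 + 2j, -1j]) + noise

# E[Z] = (E[Z_1], E[Z_2])^T, taken component-wise.
mean_Z = Z.mean(axis=0)
print(mean_Z)  # close to [1+2j, -1j]
```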

Covariance matrix and pseudo-covariance matrix


The covariance matrix (also called second central moment) $K_{ZZ}$ contains the covariances between all pairs of components. The covariance matrix of an $n \times 1$ random vector is an $n \times n$ matrix whose $(i,j)$th element is the covariance between the $i$th and the $j$th random variables.[2]:p. 372 Unlike in the case of real random variables, the covariance between two random variables involves the complex conjugate of one of the two. Thus the covariance matrix is a Hermitian matrix.[1]:p. 293

$$K_{ZZ} = \operatorname{E}[(Z - \operatorname{E}[Z])(Z - \operatorname{E}[Z])^{\mathsf H}]$$

with elements $K_{Z_i Z_j} = \operatorname{E}[(Z_i - \operatorname{E}[Z_i])\,\overline{(Z_j - \operatorname{E}[Z_j])}]$.

The pseudo-covariance matrix (also called relation matrix) is defined as follows. In contrast to the covariance matrix defined above, Hermitian transposition is replaced by transposition:

$$J_{ZZ} = \operatorname{E}[(Z - \operatorname{E}[Z])(Z - \operatorname{E}[Z])^{\mathsf T}]$$

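Both matrices can be estimated from samples. A minimal NumPy sketch (the chosen distribution, with real-part variance 4 and imaginary-part variance 1, is an arbitrary choice that makes the pseudo-covariance nonzero):

```python
import numpy as np

rng = np.random.default_rng(1)

# 2-component vector whose real parts have variance 4 and imaginary
# parts variance 1, so K_ZZ ~ 5*I while J_ZZ ~ 3*I is nonzero.
n = 200_000
Z = 2.0 * rng.standard_normal((n, 2)) + 1j * rng.standard_normal((n, 2))

Zc = Z - Z.mean(axis=0)          # center the samples
K = Zc.T @ Zc.conj() / n         # K_ZZ = E[(Z-EZ)(Z-EZ)^H]
J = Zc.T @ Zc / n                # J_ZZ = E[(Z-EZ)(Z-EZ)^T]

print(np.round(K, 2))            # approximately diag(5, 5), Hermitian
print(np.round(J, 2))            # approximately diag(3, 3), symmetric
```

Note that the estimator for `K` is Hermitian by construction, while the estimator for `J` is symmetric, mirroring the properties listed below.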

The covariance matrix is a Hermitian matrix, i.e.[1]:p. 293

$$K_{ZZ}^{\mathsf H} = K_{ZZ}.$$

The pseudo-covariance matrix is a symmetric matrix, i.e.

$$J_{ZZ}^{\mathsf T} = J_{ZZ}.$$

The covariance matrix is a positive semidefinite matrix, i.e.

$$a^{\mathsf H} K_{ZZ}\, a \geq 0 \quad \text{for all } a \in \mathbb{C}^n.$$


Covariance matrices of real and imaginary parts

By decomposing the random vector $Z$ into its real part $X = \Re(Z)$ and imaginary part $Y = \Im(Z)$ (i.e. $Z = X + iY$), the matrices $K_{ZZ}$ and $J_{ZZ}$ can be related to the covariance matrices of $X$ and $Y$ via the following expressions:

$$K_{ZZ} = K_{XX} + K_{YY} + i(K_{YX} - K_{XY})$$
$$J_{ZZ} = K_{XX} - K_{YY} + i(K_{YX} + K_{XY})$$

and conversely

$$K_{XX} = \tfrac{1}{2}\Re(K_{ZZ} + J_{ZZ}), \quad K_{YY} = \tfrac{1}{2}\Re(K_{ZZ} - J_{ZZ})$$
$$K_{XY} = \tfrac{1}{2}\Im(J_{ZZ} - K_{ZZ}), \quad K_{YX} = \tfrac{1}{2}\Im(K_{ZZ} + J_{ZZ})$$

where $K_{XY} = \operatorname{E}[(X - \operatorname{E}[X])(Y - \operatorname{E}[Y])^{\mathsf T}]$ and the other real covariance matrices are defined analogously.

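These relations can be checked numerically; in this NumPy sketch the correlation between real and imaginary parts is an arbitrary choice that makes $K_{XY}$ and $K_{YX}$ nonzero:

```python
import numpy as np

rng = np.random.default_rng(2)

# Real and imaginary parts are deliberately correlated so that the
# cross terms K_XY and K_YX are nonzero.
n = 100_000
X = rng.standard_normal((n, 2))
Y = 0.5 * X + rng.standard_normal((n, 2))
Z = X + 1j * Y

def cov(A, B):
    """Sample covariance E[(A - EA)(B - EB)^T] for real samples in rows."""
    Ac = A - A.mean(axis=0)
    Bc = B - B.mean(axis=0)
    return Ac.T @ Bc / len(A)

Kxx, Kyy = cov(X, X), cov(Y, Y)
Kxy, Kyx = cov(X, Y), cov(Y, X)

Zc = Z - Z.mean(axis=0)
K = Zc.T @ Zc.conj() / n     # K_ZZ
J = Zc.T @ Zc / n            # J_ZZ

# The identities hold exactly for the sample moments as well,
# not just in expectation.
assert np.allclose(K, Kxx + Kyy + 1j * (Kyx - Kxy))
assert np.allclose(J, Kxx - Kyy + 1j * (Kyx + Kxy))
```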
Cross-covariance matrix and pseudo-cross-covariance matrix


The cross-covariance matrix between two complex random vectors $Z$, $W$ is defined as:

$$K_{ZW} = \operatorname{E}[(Z - \operatorname{E}[Z])(W - \operatorname{E}[W])^{\mathsf H}]$$
And the pseudo-cross-covariance matrix is defined as:

$$J_{ZW} = \operatorname{E}[(Z - \operatorname{E}[Z])(W - \operatorname{E}[W])^{\mathsf T}]$$

Two complex random vectors $Z$ and $W$ are called uncorrelated if

$$K_{ZW} = J_{ZW} = 0.$$
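As a sanity check, two independently generated complex random vectors should be uncorrelated, i.e. both sample matrices should vanish up to Monte Carlo error. A NumPy sketch with an arbitrary sample size:

```python
import numpy as np

rng = np.random.default_rng(3)

# Two independently generated complex random vectors: both the
# cross-covariance and pseudo-cross-covariance should be ~0.
n = 200_000
Z = rng.standard_normal((n, 2)) + 1j * rng.standard_normal((n, 2))
W = rng.standard_normal((n, 2)) + 1j * rng.standard_normal((n, 2))

Zc, Wc = Z - Z.mean(axis=0), W - W.mean(axis=0)
K_zw = Zc.T @ Wc.conj() / n   # K_ZW = E[(Z-EZ)(W-EW)^H]
J_zw = Zc.T @ Wc / n          # J_ZW = E[(Z-EZ)(W-EW)^T]

print(np.max(np.abs(K_zw)), np.max(np.abs(J_zw)))  # both near 0
```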

Independence

Two complex random vectors $Z$ and $W$ are called independent if

$$F_{Z,W}(z, w) = F_Z(z) \cdot F_W(w) \quad \text{for all } z, w$$

where $F_Z(z)$ and $F_W(w)$ denote the cumulative distribution functions of $Z$ and $W$ as defined in Eq.1 and $F_{Z,W}(z, w)$ denotes their joint cumulative distribution function. Independence of $Z$ and $W$ is often denoted by $Z \perp\!\!\!\perp W$. Written component-wise, $Z$ and $W$ are called independent if

$$F_{Z_1,\ldots,Z_n,W_1,\ldots,W_m}(z_1, \ldots, z_n, w_1, \ldots, w_m) = F_{Z_1,\ldots,Z_n}(z_1, \ldots, z_n) \cdot F_{W_1,\ldots,W_m}(w_1, \ldots, w_m) \quad \text{for all } z_1, \ldots, z_n, w_1, \ldots, w_m.$$


Circular symmetry


A complex random vector $Z$ is called circularly symmetric if for every deterministic $\varphi \in [-\pi, \pi)$ the distribution of $e^{i\varphi}Z$ equals the distribution of $Z$.[3]:pp. 500–501


  • The expectation of a circularly symmetric complex random vector is either zero or it is not defined.[3]:p. 500
  • The pseudo-covariance matrix of a circularly symmetric complex random vector is zero.[3]:p. 584
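The standard circularly-symmetric complex Gaussian illustrates both properties: its mean and pseudo-variance vanish, and multiplying by a fixed phase $e^{i\varphi}$ changes neither. A NumPy sketch for the scalar case (sample size and angle are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(4)

# Standard circularly-symmetric complex Gaussian: real and imaginary
# parts i.i.d. N(0, 1/2), so E[|Z|^2] = 1.
n = 400_000
Z = np.sqrt(0.5) * (rng.standard_normal(n) + 1j * rng.standard_normal(n))

pseudo_var = np.mean(Z * Z)               # pseudo-variance E[Z^2] -> 0
var = np.mean(Z * np.conj(Z)).real        # variance E[|Z|^2] -> 1

# Multiplying by a fixed phase leaves the distribution, and hence the
# moments, unchanged.
phi = 0.7                                 # arbitrary deterministic angle
Zr = np.exp(1j * phi) * Z

print(abs(pseudo_var), var)               # ~0 and ~1
print(abs(np.mean(Zr * Zr)))              # still ~0 after rotation
```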

Proper complex random vectors


A complex random vector $Z$ is called proper if the following three conditions are all satisfied:[1]:p. 293

  • $\operatorname{E}[Z] = 0$ (zero mean)
  • $\operatorname{E}[Z Z^{\mathsf T}] = 0$ (vanishing pseudo-covariance)
  • $\operatorname{E}[Z^{\mathsf H} Z] < \infty$ (all components have finite variance)

Two complex random vectors $Z$, $W$ are called jointly proper if the composite random vector $(Z^{\mathsf T}, W^{\mathsf T})^{\mathsf T}$ is proper.


  • A complex random vector $Z$ is proper if, and only if, for all (deterministic) vectors $c \in \mathbb{C}^n$ the complex random variable $c^{\mathsf T} Z$ is proper.[1]:p. 293
  • Linear transformations of proper complex random vectors are proper, i.e. if $Z$ is a proper random vector with $n$ components and $A$ is a deterministic $m \times n$ matrix, then the complex random vector $AZ$ is also proper.[1]:p. 295
  • Every circularly symmetric complex random vector with finite variance of all its components is proper.[1]:p. 295
  • There are proper complex random vectors that are not circularly symmetric.[1]:p. 504
  • A real random vector is proper if and only if it is constant.
  • Two jointly proper complex random vectors are uncorrelated if and only if their covariance matrix is zero, i.e. if $K_{ZW} = 0$.
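The closure under linear transformations can be checked numerically. In this NumPy sketch, `A` is an arbitrary deterministic $2 \times 3$ complex matrix applied to a proper (zero-mean, circularly-symmetric Gaussian) vector; the transformed vector keeps zero mean and vanishing pseudo-covariance:

```python
import numpy as np

rng = np.random.default_rng(5)

# Z: proper complex random vector with 3 components (circularly-
# symmetric Gaussian, so E[Z] = 0 and E[Z Z^T] = 0).
n = 300_000
Z = np.sqrt(0.5) * (rng.standard_normal((n, 3)) + 1j * rng.standard_normal((n, 3)))

# Arbitrary deterministic 2x3 complex matrix.
A = np.array([[1.0 + 1.0j, 0.5 + 0.0j, 0.0 - 2.0j],
              [0.0 + 0.0j, 3.0 + 0.0j, 1.0 - 1.0j]])

W = Z @ A.T                   # samples of W = A Z (rows are samples)

mean_W = W.mean(axis=0)       # should stay ~0
J_w = W.T @ W / n             # pseudo-covariance of W, should stay ~0

print(np.max(np.abs(mean_W)), np.max(np.abs(J_w)))
```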

Cauchy-Schwarz inequality

The Cauchy-Schwarz inequality for complex random vectors is

$$\left| \operatorname{E}[Z^{\mathsf H} W] \right|^2 \leq \operatorname{E}[Z^{\mathsf H} Z]\, \operatorname{E}[W^{\mathsf H} W].$$
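A numerical sanity check with NumPy (the correlated construction of `W` from `Z` is an arbitrary choice). The inequality in fact holds exactly for the sample averages as well, since they are expectations under the empirical distribution:

```python
import numpy as np

rng = np.random.default_rng(6)

n = 50_000
Z = rng.standard_normal((n, 2)) + 1j * rng.standard_normal((n, 2))
W = 0.8 * Z + 0.3 * (rng.standard_normal((n, 2)) + 1j * rng.standard_normal((n, 2)))

# |E[Z^H W]|^2  vs  E[Z^H Z] * E[W^H W]
lhs = np.abs(np.mean(np.sum(np.conj(Z) * W, axis=1))) ** 2
rhs = np.mean(np.sum(np.abs(Z) ** 2, axis=1)) * np.mean(np.sum(np.abs(W) ** 2, axis=1))

print(lhs <= rhs)   # True
```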


Characteristic function

The characteristic function of a complex random vector $Z$ with $n$ components is a function $\varphi_Z \colon \mathbb{C}^n \to \mathbb{C}$ defined by:[1]:p. 295

$$\varphi_Z(\omega) = \operatorname{E}\!\left[ e^{i \Re(\omega^{\mathsf H} Z)} \right]$$

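For the scalar circularly-symmetric complex Gaussian with $\operatorname{E}[|Z|^2] = 1$ the characteristic function has the closed form $\varphi_Z(\omega) = e^{-|\omega|^2/4}$, which a Monte Carlo estimate can reproduce. A NumPy sketch (sample size and test point $\omega$ are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(7)

# Scalar circularly-symmetric complex Gaussian with E[|Z|^2] = 1;
# its characteristic function is phi_Z(w) = exp(-|w|^2 / 4).
n = 1_000_000
Z = np.sqrt(0.5) * (rng.standard_normal(n) + 1j * rng.standard_normal(n))

def char_fn(w):
    # phi_Z(w) = E[exp(i * Re(conj(w) * Z))]
    return np.mean(np.exp(1j * np.real(np.conj(w) * Z)))

w = 1.0 + 1.0j
print(char_fn(w), np.exp(-np.abs(w) ** 2 / 4))  # estimate vs exact
```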

References


  1. Lapidoth, Amos (2009). A Foundation in Digital Communication. Cambridge University Press. ISBN 978-0-521-19395-5.
  2. Gubner, John A. (2006). Probability and Random Processes for Electrical and Computer Engineers. Cambridge University Press. ISBN 978-0-521-86470-1.
  3. Tse, David; Viswanath, Pramod (2005). Fundamentals of Wireless Communication. Cambridge University Press.