Vandermonde matrix

In linear algebra, a Vandermonde matrix, named after Alexandre-Théophile Vandermonde, is a matrix with the terms of a geometric progression in each row: an m × n matrix

$$V = \begin{bmatrix}
1 & x_1 & x_1^2 & \cdots & x_1^{n-1} \\
1 & x_2 & x_2^2 & \cdots & x_2^{n-1} \\
1 & x_3 & x_3^2 & \cdots & x_3^{n-1} \\
\vdots & \vdots & \vdots & \ddots & \vdots \\
1 & x_m & x_m^2 & \cdots & x_m^{n-1}
\end{bmatrix},$$

or

$$V_{i,j} = x_i^{\,j-1}$$

for all indices i and j.[1] Some authors define the Vandermonde matrix as the transpose of the above matrix.[2]

The determinant of a square Vandermonde matrix is called a Vandermonde polynomial or Vandermonde determinant. Its value is the polynomial

$$\prod_{1 \le i < j \le n} (x_j - x_i),$$

which is non-zero if and only if all the $x_i$ are distinct.

The Vandermonde determinant was sometimes called the discriminant, although, presently, the discriminant of a polynomial is the square of the Vandermonde determinant of the roots of the polynomial. The Vandermonde determinant is an alternating form in the $x_i$, meaning that exchanging two of the $x_i$ changes the sign, while permuting the $x_i$ by an even permutation does not change the value of the determinant. It thus depends on the choice of an order for the $x_i$, while its square, the discriminant, does not depend on any order, and this implies, by Galois theory, that the discriminant is a polynomial function of the coefficients of the polynomial that has the $x_i$ as roots.
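
As a quick numerical sanity check of the determinant formula, the sketch below assumes NumPy is available (not part of the original article); it builds a square Vandermonde matrix with `numpy.vander` and compares its determinant with the product $\prod_{i<j}(x_j - x_i)$.

```python
import numpy as np
from itertools import combinations

# Sample points; any distinct values give a non-zero determinant.
x = np.array([1.0, 2.0, 4.0, 7.0])
n = len(x)

# Square Vandermonde matrix with rows (1, x_i, x_i^2, ..., x_i^{n-1}).
V = np.vander(x, N=n, increasing=True)

det_direct = np.linalg.det(V)
det_product = np.prod([x[j] - x[i] for i, j in combinations(range(n), 2)])

print(det_direct, det_product)  # both are (approximately) 540.0
```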

Proofs

The main property of a square Vandermonde matrix

$$V = \begin{bmatrix}
1 & x_1 & x_1^2 & \cdots & x_1^{n-1} \\
1 & x_2 & x_2^2 & \cdots & x_2^{n-1} \\
\vdots & \vdots & \vdots & \ddots & \vdots \\
1 & x_n & x_n^2 & \cdots & x_n^{n-1}
\end{bmatrix}$$

is that its determinant has the simple form

$$\det(V) = \prod_{1 \le i < j \le n} (x_j - x_i).$$

Three proofs of this equality are given below. The first one uses polynomial properties, especially the unique factorization property of multivariate polynomials. Although conceptually simple, it involves non-elementary concepts of abstract algebra. The second proof does not require any explicit computation, but involves the concepts of the determinant of a linear map and of change of basis. It also provides the structure of the LU decomposition of the Vandermonde matrix. The third proof is more elementary but more complicated, using only elementary row and column operations.

Using polynomial properties

By the Leibniz formula, $\det(V)$ is a polynomial in the $x_i$, with integer coefficients. All entries of the ith column have total degree i – 1. Thus, again by the Leibniz formula, all terms of the determinant have total degree

$$0 + 1 + 2 + \cdots + (n-1) = \frac{n(n-1)}{2}$$

(that is, the determinant is a homogeneous polynomial of this degree).

If, for $i \ne j$, one substitutes $x_i$ for $x_j$, one gets a matrix with two equal rows, which thus has zero determinant. Thus, by the factor theorem, $x_j - x_i$ is a divisor of $\det(V)$. By the unique factorization property of multivariate polynomials, the product of all $x_j - x_i$ divides $\det(V)$, that is

$$\det(V) = Q \prod_{1 \le i < j \le n} (x_j - x_i),$$

where $Q$ is a polynomial. As the product of all $x_j - x_i$ and $\det(V)$ have the same degree $\tfrac{n(n-1)}{2}$, the polynomial $Q$ is, in fact, a constant. This constant is one, because the product of the diagonal entries of $V$ is $x_1^0 x_2^1 \cdots x_n^{n-1}$, which is also the monomial that is obtained by taking the first term of all factors in $\prod_{1 \le i < j \le n} (x_j - x_i)$. This proves that

$$\det(V) = \prod_{1 \le i < j \le n} (x_j - x_i).$$
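
The factorization argument can also be verified symbolically for a small $n$; the following sketch assumes SymPy is available (an assumption, not part of the original text).

```python
import sympy as sp
from functools import reduce
from operator import mul

n = 4
xs = sp.symbols(f'x1:{n + 1}')  # (x1, x2, x3, x4)

# Symbolic square Vandermonde matrix: entry (i, j) is x_{i+1}**j.
V = sp.Matrix(n, n, lambda i, j: xs[i]**j)

# Product of (x_j - x_i) over all pairs i < j.
product = reduce(mul, (xs[j] - xs[i] for i in range(n) for j in range(i + 1, n)))

# The difference expands to zero, confirming det(V) = prod_{i<j} (x_j - x_i).
print(sp.expand(V.det() - product))  # prints 0
```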

Using linear maps

Let $F$ be a field containing all the $x_i$, and let $P_n$ denote the $F$-vector space of the polynomials of degree less than $n$ with coefficients in $F$. Let

$$\varphi : P_n \to F^n$$

be the linear map defined by

$$p(x) \mapsto \bigl(p(x_1), p(x_2), \dots, p(x_n)\bigr).$$

The Vandermonde matrix is the matrix of $\varphi$ with respect to the canonical bases of $P_n$ and $F^n$.

Changing the basis of $P_n$ amounts to multiplying the Vandermonde matrix by a change-of-basis matrix $M$ (from the right). This does not change the determinant if the determinant of $M$ is 1.

The polynomials $1$, $x - x_1$, $(x - x_1)(x - x_2)$, …, $(x - x_1)(x - x_2)\cdots(x - x_{n-1})$ are monic of respective degrees 0, 1, …, n – 1. Their matrix on the monomial basis is an upper-triangular matrix U (if the monomials are ordered in increasing degrees), with all diagonal entries equal to one. This matrix is thus a change-of-basis matrix of determinant one. The matrix of $\varphi$ on this new basis is

$$\begin{bmatrix}
1 & 0 & 0 & \cdots & 0 \\
1 & x_2 - x_1 & 0 & \cdots & 0 \\
1 & x_3 - x_1 & (x_3 - x_1)(x_3 - x_2) & \cdots & 0 \\
\vdots & \vdots & \vdots & \ddots & \vdots \\
1 & x_n - x_1 & (x_n - x_1)(x_n - x_2) & \cdots & (x_n - x_1)(x_n - x_2)\cdots(x_n - x_{n-1})
\end{bmatrix}.$$

Thus the Vandermonde determinant equals the determinant of this matrix, which is the product of its diagonal entries, $\prod_{1 \le i < j \le n} (x_j - x_i)$.

This proves the desired equality. Moreover, one gets the LU decomposition of $V$ as

$$V = L\,U^{-1},$$

where $L$ is the lower-triangular matrix displayed above and $U^{-1}$ is again upper triangular with all diagonal entries equal to one.
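
A small numerical illustration of this construction and of the resulting factorization; the sketch below assumes NumPy and builds $U$ and $L$ directly from the definitions in this section.

```python
import numpy as np

x = np.array([1.0, 2.0, 4.0, 7.0])
n = len(x)
V = np.vander(x, increasing=True)

# U: column j holds the monomial coefficients of 1, (t - x_1), (t - x_1)(t - x_2), ...
U = np.zeros((n, n))
for j in range(n):
    coeffs = np.array([1.0])
    for k in range(j):
        coeffs = np.convolve(coeffs, [-x[k], 1.0])  # multiply by (t - x_k)
    U[:len(coeffs), j] = coeffs

# L: evaluations of those polynomials at the points, L[i, j] = prod_{k<j} (x_i - x_k).
L = np.array([[np.prod(x[i] - x[:j]) for j in range(n)] for i in range(n)])

print(np.allclose(V @ U, L))                  # True: matrix of the map in the new basis
print(np.allclose(V, L @ np.linalg.inv(U)))   # True: V = L U^{-1}
print(np.prod(np.diag(L)), np.linalg.det(V))  # both equal the Vandermonde product
```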

By row and column operations

This third proof is based on the fact that if one adds to a column of a matrix a scalar multiple of another column, then the determinant remains unchanged.

So, by subtracting from each column, except the first one, the preceding column multiplied by $x_1$, the determinant is not changed. (These subtractions must be done starting from the last column, so that one always subtracts a column that has not yet been changed.) This gives the matrix

$$\begin{bmatrix}
1 & 0 & 0 & \cdots & 0 \\
1 & x_2 - x_1 & x_2(x_2 - x_1) & \cdots & x_2^{n-2}(x_2 - x_1) \\
1 & x_3 - x_1 & x_3(x_3 - x_1) & \cdots & x_3^{n-2}(x_3 - x_1) \\
\vdots & \vdots & \vdots & \ddots & \vdots \\
1 & x_n - x_1 & x_n(x_n - x_1) & \cdots & x_n^{n-2}(x_n - x_1)
\end{bmatrix}.$$

Applying the Laplace expansion formula along the first row, we obtain $\det(V) = \det(B)$, with

$$B = \begin{bmatrix}
x_2 - x_1 & x_2(x_2 - x_1) & \cdots & x_2^{n-2}(x_2 - x_1) \\
x_3 - x_1 & x_3(x_3 - x_1) & \cdots & x_3^{n-2}(x_3 - x_1) \\
\vdots & \vdots & \ddots & \vdots \\
x_n - x_1 & x_n(x_n - x_1) & \cdots & x_n^{n-2}(x_n - x_1)
\end{bmatrix}.$$

As all the entries in the $i$-th row of $B$ have a factor of $x_{i+1} - x_1$, one can take these factors out and obtain

$$\det(V) = \det(B) = \prod_{i=2}^{n}(x_i - x_1)\,\det\begin{bmatrix}
1 & x_2 & x_2^2 & \cdots & x_2^{n-2} \\
1 & x_3 & x_3^2 & \cdots & x_3^{n-2} \\
\vdots & \vdots & \vdots & \ddots & \vdots \\
1 & x_n & x_n^2 & \cdots & x_n^{n-2}
\end{bmatrix} = \prod_{i=2}^{n}(x_i - x_1)\,\det(V'),$$

where $V'$ is a Vandermonde matrix in $x_2, \dots, x_n$. Iterating this process on this smaller Vandermonde matrix, one eventually gets the desired expression of $\det(V)$ as the product of all $x_j - x_i$ such that $i < j$.
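
The recursion established by this proof can be mirrored in a few lines; a minimal sketch assuming NumPy (the function name is hypothetical):

```python
import numpy as np

def vandermonde_det(x):
    """det(V) via the recursion det(V_n) = prod_{i>1} (x_i - x_1) * det(V_{n-1})."""
    x = np.asarray(x, dtype=float)
    if len(x) <= 1:
        return 1.0
    return np.prod(x[1:] - x[0]) * vandermonde_det(x[1:])

x = [1.0, 2.0, 4.0, 7.0]
print(vandermonde_det(x))                            # 540.0
print(np.linalg.det(np.vander(x, increasing=True)))  # ~540.0
```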

Resulting properties

An m × n rectangular Vandermonde matrix such that m ≤ n has maximum rank m if and only if all the $x_i$ are distinct.

An m × n rectangular Vandermonde matrix such that m ≥ n has maximum rank n if and only if there are n of the $x_i$ that are distinct.

A square Vandermonde matrix is invertible if and only if the $x_i$ are distinct. An explicit formula for the inverse is known.[3][2][4]
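
A quick check of these rank statements, assuming NumPy (not part of the original article):

```python
import numpy as np

# 3 x 5 Vandermonde matrices (m <= n): full rank 3 exactly when the points are distinct.
distinct = np.vander([1.0, 2.0, 3.0], N=5, increasing=True)
repeated = np.vander([1.0, 2.0, 2.0], N=5, increasing=True)

print(np.linalg.matrix_rank(distinct))  # 3
print(np.linalg.matrix_rank(repeated))  # 2
```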

Applications

The Vandermonde matrix evaluates a polynomial at a set of points; formally, it is the matrix of the linear map that maps the vector of coefficients of a polynomial to the vector of the values of the polynomial at the values appearing in the Vandermonde matrix. The non-vanishing of the Vandermonde determinant for distinct points $x_1, \dots, x_n$ shows that, for distinct points, the map from coefficients to values at those points is a one-to-one correspondence, and thus that the polynomial interpolation problem is solvable with a unique solution; this result is called the unisolvence theorem, and is a special case of the Chinese remainder theorem for polynomials.

This may be useful in polynomial interpolation, since inverting the Vandermonde matrix allows expressing the coefficients of the polynomial in terms of the $x_i$[5] and the values of the polynomial at the $x_i$. However, the interpolation polynomial is generally easier to compute with the Lagrange interpolation formula,[6] which may also be used for deriving a formula for the inverse of a Vandermonde matrix.[7][better source needed]
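
As an illustration of solving the interpolation problem through the Vandermonde system, here is a minimal sketch assuming NumPy; the coefficients of a known quadratic are recovered from its values at three distinct points.

```python
import numpy as np

# Interpolate p(x) = 2 - 3x + x^2 from its values at three distinct points.
x = np.array([0.0, 1.0, 3.0])
y = 2.0 - 3.0 * x + x**2

V = np.vander(x, increasing=True)  # rows (1, x_i, x_i^2)
coeffs = np.linalg.solve(V, y)     # coefficients in increasing degree

print(coeffs)  # [ 2. -3.  1.]
```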

The Vandermonde determinant is used in the representation theory of the symmetric group.[8]

When the values $x_i$ belong to a finite field, the Vandermonde determinant is also called a Moore determinant and has specific properties that are used, for example, in the theory of BCH codes and Reed–Solomon error correction codes.

The discrete Fourier transform is defined by a specific Vandermonde matrix, the DFT matrix, where the numbers $\alpha_i$ are chosen to be roots of unity. Using the fast Fourier transform, the product of a Vandermonde matrix with a vector can be computed in $O(n \log^2 n)$ time.[9]
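
A small check, assuming NumPy, that the Vandermonde matrix of the $n$-th roots of unity reproduces `numpy.fft.fft`:

```python
import numpy as np

n = 8
# n-th roots of unity used by the (unnormalized) DFT matrix.
omega = np.exp(-2j * np.pi * np.arange(n) / n)

# Vandermonde matrix of the roots of unity: W[k, m] = omega_k**m.
W = np.vander(omega, increasing=True)

v = np.random.rand(n)
print(np.allclose(W @ v, np.fft.fft(v)))  # True
```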

The Laughlin wavefunction with filling factor one (appearing in the quantum Hall effect), by the formula for the Vandermonde determinant, can be seen to be a Slater determinant. This is no longer true for filling factors different from one, i.e., in the fractional quantum Hall effect.

It is the design matrix of polynomial regression.
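
A minimal sketch of this use, assuming NumPy: the Vandermonde matrix serves as the design matrix of a least-squares quadratic fit.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 50)
y = 1.0 + 2.0 * x - 3.0 * x**2 + 0.05 * rng.standard_normal(x.size)

# Design matrix for degree-2 polynomial regression: columns 1, x, x^2.
X = np.vander(x, N=3, increasing=True)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

print(beta)  # approximately [ 1.  2. -3.]
```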

It is the normalized volume of arbitrary $d$-faces of cyclic polytopes. Specifically, if $F$ is a $d$-face of a cyclic polytope whose vertices correspond to parameters $x_{i_1} < x_{i_2} < \cdots < x_{i_{d+1}}$ on the moment curve, then its normalized volume is

$$\operatorname{nvol}(F) = \prod_{1 \le m < k \le d+1} \left(x_{i_k} - x_{i_m}\right).$$
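
The identity behind this statement can be checked numerically for a single simplex; the sketch below, assuming NumPy, compares the normalized volume ($d!$ times the Euclidean volume) of the simplex with vertices $(x_i, x_i^2, \dots, x_i^d)$ on the moment curve against the Vandermonde product of the parameters.

```python
import numpy as np
from itertools import combinations

d = 3
x = np.array([0.5, 1.0, 2.0, 3.5])  # d + 1 parameters on the moment curve

# Vertices (x_i, x_i^2, ..., x_i^d) of the simplex in R^d.
verts = np.array([[t**k for k in range(1, d + 1)] for t in x])

edges = verts[1:] - verts[0]                   # d edge vectors spanning the simplex
normalized_volume = abs(np.linalg.det(edges))  # d! times the Euclidean volume

vandermonde_product = np.prod([x[j] - x[i] for i, j in combinations(range(d + 1), 2)])

print(normalized_volume, vandermonde_product)  # the two values agree
```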

Confluent Vandermonde matrices

As described before, a Vandermonde matrix describes the linear algebra interpolation problem of finding the coefficients of a polynomial $p(x)$ of degree $n - 1$ based on the values $p(x_1), \dots, p(x_n)$, where $x_1, \dots, x_n$ are distinct points. If the $x_i$ are not distinct, then this problem does not have a unique solution (which is reflected by the fact that the corresponding Vandermonde matrix is singular). However, if we give the values of the derivatives at the repeated points, then the problem can have a unique solution. For example, the problem

$$\begin{cases} p(0) = y_1, \\ p'(0) = y_2, \\ p(1) = y_3, \end{cases}$$

where $p(x) = ax^2 + bx + c$ is a polynomial of degree at most 2, has a unique solution for all $y_1, y_2, y_3$. In general, suppose that $x_1, x_2, \dots, x_n$ are (not necessarily distinct) numbers, and suppose for ease of notation that equal values come in contiguous runs in the list. That is,

$$x_1 = x_2 = \cdots = x_{m_1},\quad x_{m_1+1} = \cdots = x_{m_2},\quad \dots,\quad x_{m_{k-1}+1} = \cdots = x_{m_k},$$

where $m_1 < m_2 < \cdots < m_k = n$ and the values $x_{m_1}, x_{m_2}, \dots, x_{m_k}$ are distinct. Then the corresponding interpolation problem is

$$\begin{cases}
p(x_1) = y_1, & p'(x_1) = y_2, & \dots, & p^{(m_1-1)}(x_1) = y_{m_1}, \\
p(x_{m_1+1}) = y_{m_1+1}, & p'(x_{m_1+1}) = y_{m_1+2}, & \dots, & p^{(m_2-m_1-1)}(x_{m_1+1}) = y_{m_2}, \\
 & & \vdots & \\
p(x_{m_{k-1}+1}) = y_{m_{k-1}+1}, & p'(x_{m_{k-1}+1}) = y_{m_{k-1}+2}, & \dots, & p^{(m_k-m_{k-1}-1)}(x_{m_{k-1}+1}) = y_{m_k}.
\end{cases}$$

The corresponding matrix for this problem is called a confluent Vandermonde matrix. In our case (which is the general case, up to permuting the rows of the matrix) the formula for it is given as follows: if $1 \le i \le n$, then $m_{\ell-1} < i \le m_\ell$ for some (unique) $1 \le \ell \le k$ (where we set $m_0 = 0$). Then we have

$$V_{i,j} =
\begin{cases}
0, & \text{if } j < i - m_{\ell-1}, \\[4pt]
\dfrac{(j-1)!}{(j - i + m_{\ell-1})!}\, x_i^{\,j - i + m_{\ell-1}}, & \text{if } j \ge i - m_{\ell-1}.
\end{cases}$$

This generalization of the Vandermonde matrix makes it non-singular (so that there exists a unique solution to the system of equations) while retaining most properties of the Vandermonde matrix. Its rows are derivatives (of some order) of the original Vandermonde rows.

Another way to obtain this formula is to let some of the $x_i$'s go arbitrarily close to each other. For example, if $x_1 = x_2$, then letting $x_2 \to x_1$ in the original Vandermonde matrix, the difference between the first and second rows (divided by $x_2 - x_1$) yields the corresponding row in the confluent Vandermonde matrix. This allows us to link the generalized interpolation problem (given values and derivatives at a point) to the original case where all points are distinct: being given $p(a)$ and $p'(a)$ is similar to being given $p(a)$ and $p(a + \varepsilon)$, where $\varepsilon$ is very small.
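
A minimal construction sketch, assuming NumPy; the helper `confluent_vandermonde` below is hypothetical (not from the article) and simply implements the block formula above, solving the small example $p(0) = y_1$, $p'(0) = y_2$, $p(1) = y_3$.

```python
import numpy as np
from math import factorial

def confluent_vandermonde(points, mults):
    """Rows: 0th..(r-1)th derivatives of (1, z, z^2, ..., z^{n-1}) for each point z."""
    n = sum(mults)
    rows = []
    for z, r in zip(points, mults):
        for d in range(r):  # d-th derivative of the monomial row
            rows.append([factorial(j) // factorial(j - d) * z**(j - d) if j >= d else 0.0
                         for j in range(n)])
    return np.array(rows)

# p(0) = 1, p'(0) = 2, p(1) = 5 for a polynomial p of degree <= 2.
V = confluent_vandermonde([0.0, 1.0], [2, 1])
y = np.array([1.0, 2.0, 5.0])

coeffs = np.linalg.solve(V, y)  # coefficients (c, b, a) of c + b*x + a*x^2
print(coeffs)                   # [1. 2. 2.]
```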


References

  1. Roger A. Horn and Charles R. Johnson (1991), Topics in Matrix Analysis, Cambridge University Press. See Section 6.1.
  2. Macon, N.; Spitzbart, A. (February 1958). "Inverses of Vandermonde Matrices". The American Mathematical Monthly. 65 (2): 95–100. doi:10.2307/2308881. JSTOR 2308881.
  3. Turner, L. Richard (August 1966). Inverse of the Vandermonde Matrix with Applications (PDF).
  4. "Inverse of Vandermonde Matrix". 2018.
  5. François Viète (1540–1603), Vieta's formulas, https://en.wikipedia.org/wiki/Vieta%27s_formulas
  6. Press, WH; Teukolsky, SA; Vetterling, WT; Flannery, BP (2007). "Section 2.8.1. Vandermonde Matrices". Numerical Recipes: The Art of Scientific Computing (3rd ed.). New York: Cambridge University Press. ISBN 978-0-521-88068-8.
  7. Inverse of Vandermonde Matrix (2018), https://proofwiki.org/wiki/Inverse_of_Vandermonde_Matrix
  8. Fulton, William; Harris, Joe (1991). Representation Theory: A First Course. Graduate Texts in Mathematics, Readings in Mathematics. Vol. 129. New York: Springer-Verlag. doi:10.1007/978-1-4612-0979-9. ISBN 978-0-387-97495-8. MR 1153249. OCLC 246650103. Lecture 4 reviews the representation theory of symmetric groups, including the role of the Vandermonde determinant.
  9. Gauthier, J. "Fast Multipoint Evaluation on n Arbitrary Points". Simon Fraser University, Tech. Rep. (2017).
