Alternant matrix

In linear algebra, an alternant matrix is a matrix formed by applying a finite list of functions pointwise to a fixed column of inputs. An alternant determinant is the determinant of a square alternant matrix.

Generally, if \( f_1, f_2, \dots, f_n \) are functions from a set \( X \) to a field \( F \), and \( \alpha_1, \alpha_2, \dots, \alpha_m \in X \), then the alternant matrix has size \( m \times n \) and is defined by

\( M = \begin{bmatrix} f_1(\alpha_1) & f_2(\alpha_1) & \dots & f_n(\alpha_1)\\ f_1(\alpha_2) & f_2(\alpha_2) & \dots & f_n(\alpha_2)\\ f_1(\alpha_3) & f_2(\alpha_3) & \dots & f_n(\alpha_3)\\ \vdots & \vdots & \ddots & \vdots \\ f_1(\alpha_m) & f_2(\alpha_m) & \dots & f_n(\alpha_m) \end{bmatrix} \)

or, more compactly, \( M_{ij} = f_j(\alpha_i) \). (Some authors use the transpose of the above matrix.) Examples of alternant matrices include Vandermonde matrices, for which \( f_j(\alpha) = \alpha^{j-1} \), and Moore matrices, for which \( f_j(\alpha) = \alpha^{q^{j-1}} \).
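
As a concrete illustration, the definition \( M_{ij} = f_j(\alpha_i) \) translates directly into code. The following is a minimal Python sketch (the helper name alternant and the use of NumPy are illustrative choices, not part of the definition), shown building a Vandermonde matrix as a special case:

    import numpy as np

    def alternant(funcs, points):
        """Return the m x n alternant matrix with entries M[i, j] = funcs[j](points[i])."""
        return np.array([[f(a) for f in funcs] for a in points])

    # Vandermonde matrix as the special case f_j(x) = x**(j - 1)
    alphas = [2.0, 3.0, 5.0]
    V = alternant([lambda x, k=k: x**k for k in range(3)], alphas)
    print(V)  # rows: [1, 2, 4], [1, 3, 9], [1, 5, 25]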

Properties

The alternant can be used to check the linear independence of the functions \( f_1, f_2, \dots, f_n \) in function space. For example, let \( f_1(x) = \sin(x) \), \( f_2(x) = \cos(x) \) and choose \( \alpha_1 = 0 \), \( \alpha_2 = \pi/2 \). Then the alternant is the matrix \( \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix} \) and the alternant determinant is \( -1 \neq 0 \). Therefore \( M \) is invertible and the vectors \( \{\sin(x), \cos(x)\} \) form a basis for their spanning set: in particular, \( \sin(x) \) and \( \cos(x) \) are linearly independent.
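
A quick numerical check of this example, as a sketch using NumPy (floating-point evaluation is assumed to be accurate enough here):

    import numpy as np

    funcs = [np.sin, np.cos]
    alphas = [0.0, np.pi / 2]
    M = np.array([[f(a) for f in funcs] for a in alphas])
    print(np.linalg.det(M))  # -1.0 (nonzero), so sin and cos are linearly independent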

Linear dependence of the columns of an alternant does not imply that the functions are linearly dependent in function space. For example, let \( f_1(x) = \sin(x) \), \( f_2(x) = \cos(x) \) and choose \( \alpha_1 = 0 \), \( \alpha_2 = \pi \). Then the alternant is \( \begin{bmatrix} 0 & 1 \\ 0 & -1 \end{bmatrix} \) and the alternant determinant is 0, but we have already seen that \( \sin(x) \) and \( \cos(x) \) are linearly independent.
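
The same check at the points 0 and \( \pi \) illustrates the failure of the converse (again a NumPy sketch; the computed determinant is zero only up to floating-point roundoff):

    import numpy as np

    M = np.array([[np.sin(a), np.cos(a)] for a in (0.0, np.pi)])
    print(np.linalg.det(M))  # approximately 0: the columns are dependent, although sin and cos are not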

Despite this, the alternant can be used to find a linear dependence if it is already known that one exists. For example, we know from the theory of partial fractions that there are real numbers A and B for which \( \frac{A}{x+1} + \frac{B}{x+2} = \frac{1}{(x+1)(x+2)}. \) Choosing \( f_1(x) = \frac{1}{x+1} \), \( f_2(x) = \frac{1}{x+2} \), \( f_3(x) = \frac{1}{(x+1)(x+2)} \) and \( (\alpha_1, \alpha_2, \alpha_3) = (1, 2, 3) \), we obtain the alternant \( \begin{bmatrix} 1/2 & 1/3 & 1/6 \\ 1/3 & 1/4 & 1/12 \\ 1/4 & 1/5 & 1/20 \end{bmatrix} \sim \begin{bmatrix} 1 & 0 & 1 \\ 0 & 1 & -1 \\ 0 & 0 & 0 \end{bmatrix}. \) Therefore \( (1, -1, -1) \) is in the nullspace of the matrix: that is, \( f_1 - f_2 - f_3 = 0 \). Moving \( f_3 \) to the other side of the equation gives the partial fraction decomposition with \( A = 1 \), \( B = -1 \).
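
The row reduction above can be reproduced with exact rational arithmetic; the following sketch uses SymPy's nullspace method (an illustrative choice) and recovers a basis vector proportional to \( (1, -1, -1) \):

    from sympy import Matrix, Rational

    f1 = lambda x: Rational(1, x + 1)
    f2 = lambda x: Rational(1, x + 2)
    f3 = lambda x: Rational(1, (x + 1) * (x + 2))

    M = Matrix([[f1(a), f2(a), f3(a)] for a in (1, 2, 3)])
    print(M.nullspace())  # one basis vector, proportional to (1, -1, -1), i.e. f1 - f2 - f3 = 0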

If n = m and \( \alpha_i = \alpha_j \) for some \( i \neq j \), then the alternant determinant is zero (as a row is repeated).

If n = m and the functions \( f_j(x) \) are all polynomials, then \( (\alpha_j - \alpha_i) \) divides the alternant determinant for all \( 1 \leq i < j \leq n \). In particular, if V is a Vandermonde matrix, then \( \prod_{i < j} (\alpha_j - \alpha_i) = \det V \) divides such polynomial alternant determinants. The ratio \( \frac{\det M}{\det V} \) is therefore a polynomial in \( \alpha_1, \dots, \alpha_m \) called the bialternant. The Schur polynomial \( s_{(\lambda_1, \dots, \lambda_n)} \) is classically defined as the bialternant of the polynomials \( f_j(x) = x^{\lambda_j} \).
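
For instance, the Schur polynomial \( s_{(2,1)}(x_1, x_2) = x_1^2 x_2 + x_1 x_2^2 \) arises as such a ratio. The sketch below uses SymPy; the column exponents (1, 3) are one common way to encode the partition (2, 1) in two variables (with the staircase (0, 1) folded in), and conventions for how the exponents encode the partition vary between authors:

    from sympy import symbols, Matrix, cancel, expand

    x1, x2 = symbols('x1 x2')
    alphas = (x1, x2)

    exponents = (1, 3)  # encodes the partition (2, 1) in two variables under this convention
    M = Matrix([[a**e for e in exponents] for a in alphas])            # polynomial alternant
    V = Matrix([[a**j for j in range(len(alphas))] for a in alphas])   # Vandermonde matrix

    print(expand(cancel(M.det() / V.det())))  # x1**2*x2 + x1*x2**2 = s_(2,1)(x1, x2)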

Applications

Alternant matrices are used in coding theory in the construction of alternant codes.

See also

List of matrices
Wronskian

