
In linear algebra, an orthogonal transformation is a linear transformation T : V → V on a real inner product space V that preserves the inner product. That is, for each pair u, v of elements of V, we have[1]

\( \langle u,v \rangle = \langle Tu,Tv \rangle \, . \)

Since the lengths of vectors and the angles between them are defined through the inner product, orthogonal transformations preserve lengths of vectors and angles between them. In particular, orthogonal transformations map orthonormal bases to orthonormal bases.
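For example, taking v = u in the definition shows at once that lengths are preserved:

\( \|Tu\|^{2} = \langle Tu,Tu \rangle = \langle u,u \rangle = \|u\|^{2} \, . \)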

Orthogonal transformations in two- or three-dimensional Euclidean space are rigid rotations, reflections, or combinations of a rotation and a reflection (also known as improper rotations). Reflections are transformations that reverse the direction front to back, orthogonal to the mirror plane, like (real-world) mirrors do. The matrices corresponding to proper rotations (without reflection) have a determinant of +1. Transformations with reflection are represented by matrices with a determinant of −1. This allows the concept of rotation and reflection to be generalized to higher dimensions.
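As an illustrative sketch (not part of the original article), the sign of the determinant can be checked numerically with NumPy; the rotation angle below is an arbitrary choice:

import numpy as np

theta = 0.7  # arbitrary angle, chosen only for illustration

# A proper rotation of the plane: determinant +1.
rotation = np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

# A reflection across the x-axis: determinant -1.
reflection = np.array([[1.0,  0.0],
                       [0.0, -1.0]])

print(np.linalg.det(rotation))    # approximately +1.0
print(np.linalg.det(reflection))  # approximately -1.0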

In finite-dimensional spaces, the matrix representation (with respect to an orthonormal basis) of an orthogonal transformation is an orthogonal matrix. Its rows are mutually orthogonal vectors with unit norm, so that the rows constitute an orthonormal basis of V. The columns of the matrix form another orthonormal basis of V.

The inverse of an orthogonal transformation is another orthogonal transformation. Its matrix representation is the transpose of the matrix representation of the original transformation.
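As a minimal numerical sketch of both facts (assuming NumPy, and using the planar rotation matrix from the Examples section below with an arbitrary angle):

import numpy as np

theta = 1.2  # arbitrary angle, chosen only for illustration
T = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Rows and columns are orthonormal, so T^t T = I and T T^t = I.
assert np.allclose(T.T @ T, np.eye(2))
assert np.allclose(T @ T.T, np.eye(2))

# The inverse of the transformation is its transpose.
assert np.allclose(np.linalg.inv(T), T.T)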
Examples

Consider the inner-product space \( (\mathbb{R}^{2},\langle \cdot ,\cdot \rangle ) \) with the standard Euclidean inner product and standard basis. Then, the matrix transformation

\( T=\begin{bmatrix}\cos(\theta )&-\sin(\theta )\\\sin(\theta )&\cos(\theta )\end{bmatrix}:\mathbb{R}^{2}\to \mathbb{R}^{2} \)

is orthogonal. To see this, consider

\( Te_{1}=\begin{bmatrix}\cos(\theta )\\\sin(\theta )\end{bmatrix}, \qquad Te_{2}=\begin{bmatrix}-\sin(\theta )\\\cos(\theta )\end{bmatrix} \)

Then,

\( \begin{aligned}\langle Te_{1},Te_{1}\rangle &=\begin{bmatrix}\cos(\theta )&\sin(\theta )\end{bmatrix}\cdot \begin{bmatrix}\cos(\theta )\\\sin(\theta )\end{bmatrix}=\cos ^{2}(\theta )+\sin ^{2}(\theta )=1\\\langle Te_{1},Te_{2}\rangle &=\begin{bmatrix}\cos(\theta )&\sin(\theta )\end{bmatrix}\cdot \begin{bmatrix}-\sin(\theta )\\\cos(\theta )\end{bmatrix}=-\sin(\theta )\cos(\theta )+\sin(\theta )\cos(\theta )=0\\\langle Te_{2},Te_{2}\rangle &=\begin{bmatrix}-\sin(\theta )&\cos(\theta )\end{bmatrix}\cdot \begin{bmatrix}-\sin(\theta )\\\cos(\theta )\end{bmatrix}=\sin ^{2}(\theta )+\cos ^{2}(\theta )=1\end{aligned} \)
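These inner products can also be spot-checked numerically for a particular angle (a sketch assuming NumPy; the sample angle is arbitrary):

import numpy as np

theta = np.pi / 6  # arbitrary sample angle
T = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])

print(np.dot(T @ e1, T @ e1))  # approximately 1
print(np.dot(T @ e1, T @ e2))  # approximately 0
print(np.dot(T @ e2, T @ e2))  # approximately 1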

The previous example can be extended to construct all orthogonal transformations. For example, the following matrices define orthogonal transformations on \( (\mathbb{R}^{3},\langle \cdot ,\cdot \rangle ) \):

\( \begin{bmatrix}\cos(\theta )&-\sin(\theta )&0\\\sin(\theta )&\cos(\theta )&0\\0&0&1\end{bmatrix},\quad \begin{bmatrix}\cos(\theta )&0&-\sin(\theta )\\0&1&0\\\sin(\theta )&0&\cos(\theta )\end{bmatrix},\quad \begin{bmatrix}1&0&0\\0&\cos(\theta )&-\sin(\theta )\\0&\sin(\theta )&\cos(\theta )\end{bmatrix} \)
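As a hedged sketch (assuming NumPy; the helper names rot_z, rot_y, rot_x and the angle values are illustrative choices, not part of the original article), one can verify that each matrix M above satisfies M^T M = I, and that a product of such matrices is again orthogonal:

import numpy as np

def rot_z(t):
    return np.array([[np.cos(t), -np.sin(t), 0.0],
                     [np.sin(t),  np.cos(t), 0.0],
                     [0.0,        0.0,       1.0]])

def rot_y(t):
    return np.array([[np.cos(t), 0.0, -np.sin(t)],
                     [0.0,       1.0,  0.0],
                     [np.sin(t), 0.0,  np.cos(t)]])

def rot_x(t):
    return np.array([[1.0, 0.0,        0.0],
                     [0.0, np.cos(t), -np.sin(t)],
                     [0.0, np.sin(t),  np.cos(t)]])

theta = 0.9  # arbitrary angle
for M in (rot_z(theta), rot_y(theta), rot_x(theta)):
    assert np.allclose(M.T @ M, np.eye(3))

# A product of orthogonal matrices is again orthogonal.
P = rot_z(0.3) @ rot_y(1.1) @ rot_x(-0.5)
assert np.allclose(P.T @ P, np.eye(3))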

See also

Improper rotation
Linear transformation
Orthogonal matrix
Unitary transformation

References

Rowland, Todd. "Orthogonal Transformation". MathWorld. Retrieved 4 May 2012.
