
In statistics, probability theory, and information theory, a statistical distance quantifies the distance between two statistical objects, which can be two random variables, two probability distributions or samples, or an individual sample point and a population or a wider sample of points.

A distance between two populations can be interpreted as measuring the distance between the corresponding probability distributions, so such distances are essentially measures of distances between probability measures. Where statistical distance measures relate to the differences between random variables, these variables may have statistical dependence,[1] and hence these distances are not directly related to measures of distances between probability measures. That is, a measure of distance between random variables may relate to the extent of dependence between them, rather than to their individual values.
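To illustrate the distinction, here is a minimal sketch (assuming NumPy and SciPy are available; the variable names are only illustrative): two samples can come from the same distribution, so that a distance between their distributions is essentially zero, while a dependence-oriented quantity still distinguishes whether the underlying variables are dependent or independent.

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)
y_copy = x                            # maximally dependent on x
y_indep = rng.normal(size=100_000)    # independent of x, same distribution

# A distance between the distributions (here the two-sample
# Kolmogorov–Smirnov statistic) is essentially zero in both cases:
print(ks_2samp(x, y_copy).statistic)    # 0.0
print(ks_2samp(x, y_indep).statistic)   # small, only sampling noise

# A dependence-oriented measure tells the two situations apart:
print(np.corrcoef(x, y_copy)[0, 1])     # 1.0
print(np.corrcoef(x, y_indep)[0, 1])    # ≈ 0.0
```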

Statistical distance measures are mostly not metrics and they need not be symmetric. Some types of distance measures are referred to as (statistical) divergences.

Terminology

Many terms are used to refer to various notions of distance; these are often confusingly similar, and may be used inconsistently between authors and over time, either loosely or with precise technical meaning. In addition to "distance", similar terms include deviance, deviation, discrepancy, discrimination, and divergence, as well as others such as contrast function and metric. Terms from information theory include cross entropy, relative entropy, discrimination information, and information gain.

Distances as metrics
Metrics

A metric on a set X is a function (called the distance function or simply distance)

d : X × X → R+ (where R+ is the set of non-negative real numbers). For all x, y, z in X, this function is required to satisfy the following conditions:

1. d(x, y) ≥ 0 (non-negativity)
2. d(x, y) = 0 if and only if x = y (identity of indiscernibles; conditions 1 and 2 together produce positive definiteness)
3. d(x, y) = d(y, x) (symmetry)
4. d(x, z) ≤ d(x, y) + d(y, z) (subadditivity / triangle inequality).
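As a concrete sketch (not part of the article; assuming discrete distributions given as probability vectors on a common finite support), the total variation distance satisfies all four conditions and is therefore a metric on the space of such distributions:

```python
import numpy as np

def total_variation(p, q):
    """Total variation distance between two discrete distributions
    given as probability vectors on the same finite support."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return 0.5 * np.abs(p - q).sum()

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])
r = np.array([0.1, 0.1, 0.8])

assert total_variation(p, p) == 0.0                     # conditions 1 and 2
assert total_variation(p, q) == total_variation(q, p)   # condition 3 (symmetry)
assert total_variation(p, r) <= total_variation(p, q) + total_variation(q, r)  # condition 4
print(total_variation(p, q))  # ≈ 0.1
```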

Generalized metrics

Many statistical distances are not metrics, because they lack one or more properties of proper metrics. For example, pseudometrics violate the "positive definiteness" (alternatively, "identity of indiscernibles") property (conditions 1 and 2 above); quasimetrics violate the symmetry property (condition 3); and semimetrics violate the triangle inequality (condition 4). Statistical distances that satisfy conditions (1) and (2) are referred to as divergences.
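For example, the Kullback–Leibler divergence satisfies conditions (1) and (2) but not (3): it is not symmetric in its arguments. A minimal sketch (assuming strictly positive discrete distributions so the logarithms are defined; plain NumPy):

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback–Leibler divergence D(p || q) for strictly positive
    discrete distributions on the same finite support."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return np.sum(p * np.log(p / q))

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.1, 0.6, 0.3])

print(kl_divergence(p, q))  # ≈ 0.516 (non-negative)
print(kl_divergence(q, p))  # ≈ 0.377 (differs from the above: not symmetric)
print(kl_divergence(p, p))  # 0.0     (zero exactly when the distributions coincide)
```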
Examples

Some important statistical distances include the following:

Total variation distance
Hellinger distance
Lévy–Prokhorov metric
Wasserstein metric (earth mover's distance)
Mahalanobis distance
Energy distance
Kullback–Leibler divergence
Rényi divergence
Jensen–Shannon divergence
Bhattacharyya distance
f-divergence (a general family that includes several of the above)
Kolmogorov–Smirnov statistic
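As a rough sketch of how some of these are computed for discrete distributions (the helper names are illustrative, not standard library functions): the Hellinger distance is a bounded metric, while the closely related Bhattacharyya distance is a divergence that does not obey the triangle inequality.

```python
import numpy as np

def hellinger(p, q):
    """Hellinger distance between two discrete distributions (a metric, bounded by 1)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

def bhattacharyya(p, q):
    """Bhattacharyya distance between two discrete distributions (not a metric)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return -np.log(np.sum(np.sqrt(p * q)))

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.1, 0.6, 0.3])

print(hellinger(p, q))      # ≈ 0.327
print(bhattacharyya(p, q))  # ≈ 0.113
```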

Other approaches

See also

Probabilistic metric space
Notes

1. Dodge, Y. (2003), entry for distance.

External links

Distance and Similarity Measures (Wolfram Alpha)
