Logistic distribution

In probability theory and statistics, the logistic distribution is a continuous probability distribution. Its cumulative distribution function is the logistic function, which appears in logistic regression and feedforward neural networks. It resembles the normal distribution in shape but has heavier tails (higher kurtosis). The logistic distribution is a special case of the Tukey lambda distribution.

Specification
Probability density function

When the location parameter μ is 0 and the scale parameter s is 1, then the probability density function of the logistic distribution is given by

\( {\displaystyle {\begin{aligned}f(x;0,1)&={\frac {e^{-x}}{(1+e^{-x})^{2}}}\\[4pt]&={\frac {1}{(e^{x/2}+e^{-x/2})^{2}}}\\[5pt]&={\frac {1}{4}}\operatorname {sech} ^{2}\left({\frac {x}{2}}\right).\end{aligned}}} \)

Thus in general the density is:

\( {\displaystyle {\begin{aligned}f(x;\mu ,s)&={\frac {e^{-(x-\mu )/s}}{s\left(1+e^{-(x-\mu )/s}\right)^{2}}}\\[4pt]&={\frac {1}{s\left(e^{(x-\mu )/(2s)}+e^{-(x-\mu )/(2s)}\right)^{2}}}\\[4pt]&={\frac {1}{4s}}\operatorname {sech} ^{2}\left({\frac {x-\mu }{2s}}\right).\end{aligned}}} \)

Because this function can be expressed in terms of the square of the hyperbolic secant function "sech", it is sometimes referred to as the sech-square(d) distribution.[1]

See also: hyperbolic secant distribution
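
A minimal numerical check of these equivalent forms of the density (a sketch, assuming NumPy and SciPy are available; scipy.stats.logistic uses loc = μ and scale = s):

import numpy as np
from scipy.stats import logistic

def pdf_sech_form(x, mu=0.0, s=1.0):
    # density written via the squared hyperbolic secant, as above
    return (1.0 / (4.0 * s)) / np.cosh((x - mu) / (2.0 * s)) ** 2

def pdf_exp_form(x, mu=0.0, s=1.0):
    # density written via the exponential form
    z = np.exp(-(x - mu) / s)
    return z / (s * (1.0 + z) ** 2)

x = np.linspace(-8.0, 8.0, 101)
assert np.allclose(pdf_sech_form(x, 1.0, 2.0), pdf_exp_form(x, 1.0, 2.0))
assert np.allclose(pdf_exp_form(x, 1.0, 2.0), logistic.pdf(x, loc=1.0, scale=2.0))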

Cumulative distribution function

The logistic distribution receives its name from its cumulative distribution function, which is an instance of the family of logistic functions. The cumulative distribution function of the logistic distribution is also a shifted and scaled version of the hyperbolic tangent.

\( {\displaystyle F(x;\mu ,s)={\frac {1}{1+e^{-(x-\mu )/s}}}={\frac {1}{2}}+{\frac {1}{2}}\operatorname {tanh} \left({\frac {x-\mu }{2s}}\right).} \)

In this equation, x is the random variable, μ is the mean, and s is a scale parameter proportional to the standard deviation.
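
The equivalence of the sigmoid and hyperbolic-tangent forms of the CDF can be checked numerically in the same way (a sketch, assuming NumPy and SciPy):

import numpy as np
from scipy.stats import logistic

mu, s = 1.0, 2.0
x = np.linspace(-10.0, 10.0, 201)
cdf_sigmoid = 1.0 / (1.0 + np.exp(-(x - mu) / s))
cdf_tanh = 0.5 + 0.5 * np.tanh((x - mu) / (2.0 * s))
assert np.allclose(cdf_sigmoid, cdf_tanh)
assert np.allclose(cdf_sigmoid, logistic.cdf(x, loc=mu, scale=s))
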
Quantile function

The inverse cumulative distribution function (quantile function) of the logistic distribution is a generalization of the logit function. Its derivative is called the quantile density function. They are defined as follows:

\( {\displaystyle Q(p;\mu ,s)=\mu +s\ln \left({\frac {p}{1-p}}\right).} \)

\( Q'(p;s)={\frac {s}{p(1-p)}}. \)
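
Because the quantile function has this simple closed form, logistic variates can be generated by inverse-transform sampling, i.e. by applying Q to uniform random numbers (a sketch, assuming NumPy):

import numpy as np

rng = np.random.default_rng(0)

def sample_logistic(n, mu=0.0, s=1.0):
    # inverse-transform sampling: apply the quantile function to U(0, 1) draws
    p = rng.uniform(size=n)
    return mu + s * np.log(p / (1.0 - p))

x = sample_logistic(100_000, mu=2.0, s=1.5)
print(x.mean())   # approximately mu = 2.0
print(x.var())    # approximately (s * pi)**2 / 3, i.e. about 7.4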

Alternative parameterization

An alternative parameterization of the logistic distribution can be derived by expressing the scale parameter, s, in terms of the standard deviation, \( \sigma \), using the substitution \( s\,=\,q\,\sigma \), where \( {\displaystyle q\,=\,{\sqrt {3}}/{\pi }\,=\,0.551328895\ldots } \). Substituting \( s\,=\,q\,\sigma \) into the density, distribution and quantile functions above gives their alternative forms directly.
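
A small sketch of this reparameterization, converting between the scale s and the standard deviation σ (assuming NumPy):

import numpy as np

q = np.sqrt(3.0) / np.pi              # 0.551328895...

def s_from_sigma(sigma):
    # scale parameter expressed through the standard deviation
    return q * sigma

def sigma_from_s(s):
    # standard deviation of a Logistic(mu, s) variable
    return s * np.pi / np.sqrt(3.0)

print(s_from_sigma(1.0))   # 0.5513..., the scale of a unit-variance logistic
print(sigma_from_s(1.0))   # 1.8138..., the standard deviation when s = 1
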
Applications

The logistic distribution—and the S-shaped pattern of its cumulative distribution function (the logistic function) and quantile function (the logit function)—have been extensively used in many different areas.
Logistic regression

One of the most common applications is in logistic regression, which is used for modeling categorical dependent variables (e.g., yes-no choices or a choice of 3 or 4 possibilities), much as standard linear regression is used for modeling continuous variables (e.g., income or population). Specifically, logistic regression models can be phrased as latent variable models with error variables following a logistic distribution. This phrasing is common in the theory of discrete choice models, where the logistic distribution plays the same role in logistic regression as the normal distribution does in probit regression. Indeed, the logistic and normal distributions have a quite similar shape. However, the logistic distribution has heavier tails, which often increases the robustness of analyses based on it compared with using the normal distribution.
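
As an illustration of the latent-variable phrasing (a simulation sketch, not any particular library's implementation; the coefficients are arbitrary assumptions), thresholding a latent variable with standard logistic errors at zero reproduces the logistic-regression response probability F(β₀ + β₁x; 0, 1):

import numpy as np
from scipy.stats import logistic

rng = np.random.default_rng(1)
beta0, beta1 = -0.5, 2.0              # arbitrary illustrative coefficients
x = rng.normal(size=200_000)

# latent-variable formulation: y* = beta0 + beta1*x + error, error ~ Logistic(0, 1)
y_latent = beta0 + beta1 * x + logistic.rvs(size=x.size, random_state=rng)
y = (y_latent > 0).astype(int)

# the implied response probability is the logistic CDF of the linear predictor
p_model = logistic.cdf(beta0 + beta1 * x)
print(y.mean(), p_model.mean())       # the two averages agree closely
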
Physics

The PDF of this distribution has the same functional form as the derivative of the Fermi function. In the theory of electron properties in semiconductors and metals, this derivative sets the relative weight of the various electron energies in their contributions to electron transport. Those energy levels whose energies are closest to the distribution's "mean" (Fermi level) dominate processes such as electronic conduction, with some smearing induced by temperature.[2]:34 Note however that the pertinent probability distribution in Fermi–Dirac statistics is actually a simple Bernoulli distribution, with the probability factor given by the Fermi function.
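
A numerical illustration of this correspondence (a sketch; the Fermi level and temperature below are arbitrary assumptions): the negative derivative of the Fermi function is the logistic density with μ equal to the Fermi level and s = kT.

import numpy as np
from scipy.stats import logistic

E_f = 5.0     # Fermi level in eV (arbitrary illustrative value)
kT = 0.025    # thermal energy in eV (roughly room temperature)

E = np.linspace(E_f - 0.5, E_f + 0.5, 2001)
fermi = 1.0 / (1.0 + np.exp((E - E_f) / kT))   # Fermi function

# the numerical derivative of the Fermi function matches the logistic density
dfermi_dE = np.gradient(fermi, E)
assert np.allclose(-dfermi_dE, logistic.pdf(E, loc=E_f, scale=kT), atol=1e-2)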

The logistic distribution also arises as the limiting distribution of a finite-velocity damped random motion described by a telegraph process in which the random times between consecutive velocity changes have independent exponential distributions with linearly increasing parameters.[3]
Hydrology
Figure: cumulative logistic distribution fitted to ranked October rainfalls using CumFreq (see also distribution fitting).

In hydrology the distribution of long-duration river discharge and rainfall (e.g., monthly and yearly totals, which are sums of roughly 30 or 360 daily values respectively) is often regarded as nearly normal, in line with the central limit theorem.[4] The normal cumulative distribution function, however, has no closed form and must be evaluated numerically, whereas the logistic distribution, which closely resembles the normal, has an analytic CDF and can therefore be used instead. The figure illustrates a logistic distribution fitted to ranked October rainfalls—which are almost normally distributed—together with the 90% confidence belt based on the binomial distribution. The rainfall data are represented by plotting positions as part of the cumulative frequency analysis.
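
A sketch of such a fit using SciPy's built-in maximum-likelihood fitting (the rainfall totals below are invented placeholder numbers, not the data shown in the figure):

import numpy as np
from scipy.stats import logistic

# invented monthly rainfall totals (mm); a real analysis would use observed data
rain = np.array([45.0, 62.0, 71.0, 80.0, 85.0, 93.0, 101.0, 110.0, 118.0, 131.0])

mu_hat, s_hat = logistic.fit(rain)    # maximum-likelihood estimates of mu and s
print(mu_hat, s_hat)

# cumulative (non-exceedance) probability assigned to each observation by the fit
print(logistic.cdf(np.sort(rain), loc=mu_hat, scale=s_hat))
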
Chess ratings

The United States Chess Federation and FIDE have switched their formulas for calculating chess ratings from the normal distribution to the logistic distribution; see the article on the Elo rating system (which was originally based on the normal distribution).
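
In the standard logistic Elo formula, the expected score of a player is the logistic CDF of the rating difference with scale s = 400/ln 10, so that a 400-point advantage corresponds to 10:1 expected odds (a sketch, assuming NumPy and SciPy):

import numpy as np
from scipy.stats import logistic

def expected_score(r_a, r_b):
    # logistic Elo expectation: 1 / (1 + 10**(-(r_a - r_b) / 400))
    return logistic.cdf(r_a - r_b, loc=0.0, scale=400.0 / np.log(10.0))

print(expected_score(1613, 1613))   # 0.5 for equal ratings
print(expected_score(2000, 1600))   # about 0.909, i.e. 10:1 expected odds
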
Related distributions

The logistic distribution closely resembles the hyperbolic secant distribution in shape. Several of the relations listed below can be verified by simulation; a numerical check is sketched after the list.
If X ~ Logistic(μ, β) then kX + ℓ ~ Logistic(kμ + ℓ, kβ).
If X ~ U(0, 1) then μ + β(log(X) − log(1 − X)) ~ Logistic(μ, β).
If \( {\displaystyle X\sim \mathrm {Gumbel} (\alpha _{X},\beta )} \) and \( {\displaystyle Y\sim \mathrm {Gumbel} (\alpha _{Y},\beta )} \) are independent, then \( {\displaystyle X-Y\sim \mathrm {Logistic} (\alpha _{X}-\alpha _{Y},\beta )} \).
If X and Y are independent with \( {\displaystyle X,Y\sim \mathrm {Gumbel} (\alpha ,\beta )} \), then \( {\displaystyle X+Y\nsim \mathrm {Logistic} (2\alpha ,\beta )} \) (the sum is not logistic). Note that \( {\displaystyle E(X+Y)=2\alpha +2\beta \gamma \neq 2\alpha =E\left(\mathrm {Logistic} (2\alpha ,\beta )\right)} \), where \( \gamma \) is the Euler–Mascheroni constant.
If X ~ Logistic(μ, s) then exp(X) ~ LogLogistic\( {\displaystyle \left(\alpha =e^{\mu },\beta ={\frac {1}{s}}\right)} \), and exp(X) + γ follows a shifted log-logistic distribution with parameters \( {\displaystyle \left(\alpha =e^{\mu },\beta ={\frac {1}{s}},\gamma \right)} \).

If X ~ Exponential(1) then

\( {\displaystyle \mu +\beta \log(e^{X}-1)\sim \operatorname {Logistic} (\mu ,\beta ).} \)

If X, Y ~ Exponential(1) then

\( {\displaystyle \mu -\beta \log \left({\frac {X}{Y}}\right)\sim \operatorname {Logistic} (\mu ,\beta ).} \)
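
A minimal simulation check of two of the relations above, the uniform transform and the difference of two independent Gumbel variables (a sketch, assuming NumPy and SciPy):

import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(2)
mu, beta = 1.0, 2.0
n = 200_000

# uniform transform: mu + beta*(log U - log(1 - U)) ~ Logistic(mu, beta)
u = rng.uniform(size=n)
x1 = mu + beta * (np.log(u) - np.log(1.0 - u))

# difference of two independent Gumbel variables with a common scale is logistic
g1 = gumbel_r.rvs(loc=3.0, scale=beta, size=n, random_state=rng)
g2 = gumbel_r.rvs(loc=2.0, scale=beta, size=n, random_state=rng)
x2 = g1 - g2                          # ~ Logistic(3.0 - 2.0, beta) = Logistic(mu, beta)

for x in (x1, x2):
    print(x.mean(), x.std())          # both close to mu = 1.0 and beta*pi/sqrt(3), about 3.63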

Derivations
Higher-order moments

The nth-order central moment can be expressed in terms of the quantile function:
\( {\displaystyle {\begin{aligned}\operatorname {E} [(X-\mu )^{n}]&=\int _{-\infty }^{\infty }(x-\mu )^{n}\,dF(x)\\&=\int _{0}^{1}{\big (}Q(p)-\mu {\big )}^{n}\,dp=s^{n}\int _{0}^{1}\left[\ln \!\left({\frac {p}{1-p}}\right)\right]^{n}\,dp.\end{aligned}}} \)

This integral is well-known[5] and can be expressed in terms of Bernoulli numbers:

\( {\displaystyle \operatorname {E} [(X-\mu )^{n}]=s^{n}\pi ^{n}(2^{n}-2)\cdot |B_{n}|.} \)
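
A sketch verifying this closed form against direct numerical integration of the quantile-function integral (assuming NumPy and SciPy):

import numpy as np
from scipy.integrate import quad
from scipy.special import bernoulli

s = 1.3
for n in (2, 4, 6):
    # closed form: s^n * pi^n * (2^n - 2) * |B_n|
    closed = s**n * np.pi**n * (2**n - 2) * abs(bernoulli(n)[-1])
    # direct integral of (s * log(p/(1-p)))**n over p in (0, 1)
    numeric, _ = quad(lambda p: (s * np.log(p / (1.0 - p))) ** n, 0.0, 1.0)
    print(n, closed, numeric)         # the two values agree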

See also

Generalized logistic distribution
Tukey lambda distribution
Logistic regression
Log-logistic distribution
Sigmoid function

Notes

Johnson, Kotz & Balakrishnan (1995, p.116).
Davies, John H. (1998). The Physics of Low-dimensional Semiconductors: An Introduction. Cambridge University Press. ISBN 9780521484916.
A. Di Crescenzo, B. Martinucci (2010) "A damped telegraph random process with logistic stationary distribution", J. Appl. Prob., vol. 47, pp. 84–96.
Ritzema, H.P., ed. (1994). Frequency and Regression Analysis. Chapter 6 in: Drainage Principles and Applications, Publication 16, International Institute for Land Reclamation and Improvement (ILRI), Wageningen, The Netherlands. pp. 175–224. ISBN 90-70754-33-9.
OEIS: A001896

References

John S. deCani & Robert A. Stine (1986). "A note on deriving the information matrix for a logistic distribution". The American Statistician. American Statistical Association. 40: 220–222. doi:10.2307/2684541.
Balakrishnan, N. (1992). Handbook of the Logistic Distribution. Marcel Dekker, New York. ISBN 0-8247-8587-8.
Johnson, N. L.; Kotz, S.; Balakrishnan, N. (1995). Continuous Univariate Distributions. Vol. 2 (2nd ed.). ISBN 0-471-58494-0.

Modis, Theodore (1992) Predictions: Society's Telltale Signature Reveals the Past and Forecasts the Future, Simon & Schuster, New York. ISBN 0-671-75917-5
