
In optimization, a gradient method is an algorithm to solve problems of the form

\( \min_{x \in \mathbb{R}^{n}} f(x) \)

with the search directions defined by the gradient of the function at the current point. Examples of gradient methods are gradient descent and the conjugate gradient method.
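As a minimal sketch, the simplest gradient method, gradient descent, iterates \( x_{k+1} = x_k - \gamma \nabla f(x_k) \) for a step size \( \gamma > 0 \). The Python code below illustrates this update; the quadratic objective, fixed step size, and stopping tolerance are illustrative assumptions, not prescriptions from the article:

import numpy as np

def gradient_descent(grad_f, x0, step_size=0.1, tol=1e-8, max_iter=1000):
    # Repeat x <- x - step_size * grad_f(x) until the gradient is nearly zero.
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:
            break
        x = x - step_size * g
    return x

# Hypothetical example: minimize f(x) = ||x - 1||^2, whose gradient is 2(x - 1).
x_star = gradient_descent(lambda x: 2.0 * (x - np.ones(3)), x0=np.zeros(3))
print(x_star)  # converges to approximately [1. 1. 1.]

More sophisticated gradient methods differ mainly in how the step size and search direction are chosen at each iteration.
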
See also

Gradient descent
Stochastic gradient descent
Coordinate descent
Frank–Wolfe algorithm
Landweber iteration
Random coordinate descent
Conjugate gradient method
Derivation of the conjugate gradient method
Nonlinear conjugate gradient method
Biconjugate gradient method
Biconjugate gradient stabilized method

