@article{Shewchuk94,
abstract = {The Conjugate Gradient Method is the most prominent iterative method for
solving sparse systems of linear equations. Unfortunately, many textbook
treatments of the topic are written with neither illustrations nor intuition,
and their victims can be found to this day babbling senselessly in the corners
of dusty libraries. For this reason, a deep, geometric understanding of the
method has been reserved for the elite brilliant few who have painstakingly
decoded the mumblings of their forebears. Nevertheless, the Conjugate Gradient
Method is a composite of simple, elegant ideas that almost anyone can
understand. Of course, a reader as intelligent as yourself will learn them
almost effortlessly.
The idea of quadratic forms is introduced and used to derive the methods of
Steepest Descent, Conjugate Directions, and Conjugate Gradients. Eigenvectors
are explained and used to examine the convergence of the Jacobi Method,
Steepest Descent, and Conjugate Gradients. Other topics include
preconditioning and the nonlinear Conjugate Gradient Method. I have taken
pains to make this article easy to read. Sixty-two illustrations are
provided. Dense prose is avoided. Concepts are explained in several
different ways. Most equations are coupled with an intuitive interpretation.
},
author = {Shewchuk, Jonathan Richard},
keywords = {machine-learning},
month = {August},
title = {An Introduction to the Conjugate Gradient Method Without the Agonizing Pain},
url = {http://www.cs.cmu.edu/~quake-papers/painless-conjugate-gradient.pdf},
year = 1994
}