Add doc page on computing Least Squares.

This commit is contained in:
Jitse Niesen 2014-01-18 01:16:17 +00:00
parent a58325ac2f
commit aa0db35185
5 changed files with 86 additions and 5 deletions

doc/LeastSquares.dox Normal file

@ -0,0 +1,70 @@
namespace Eigen {
/** \eigenManualPage LeastSquares Solving linear least squares systems
This page describes how to solve linear least squares systems using %Eigen. An overdetermined system
of equations, say \a Ax = \a b, generally has no solution. In this case, it makes sense to search for the
vector \a x which is closest to being a solution, in the sense that the difference \a Ax - \a b is
as small as possible. This \a x is called the least squares solution (if the Euclidean norm is used).
The three methods discussed on this page are the SVD decomposition, the QR decomposition and the normal
equations. Of these, the SVD decomposition is generally the most accurate but the slowest, the normal
equations are the fastest but least accurate, and the QR decomposition is in between.
\eigenAutoToc
\section LeastSquaresSVD Using the SVD decomposition
The \link JacobiSVD::solve() solve() \endlink method in the JacobiSVD class can be used directly to
solve linear least squares systems. It is not enough to compute only the singular values (the default
for this class); you also need the singular vectors, but the thin SVD decomposition suffices for
computing least squares solutions:
<table class="example">
<tr><th>Example:</th><th>Output:</th></tr>
<tr>
<td>\include TutorialLinAlgSVDSolve.cpp </td>
<td>\verbinclude TutorialLinAlgSVDSolve.out </td>
</tr>
</table>
This example comes from the page \link TutorialLinearAlgebra Linear algebra and decompositions \endlink.
\section LeastSquaresQR Using the QR decomposition
The solve() method in QR decomposition classes also computes the least squares solution. There are
three QR decomposition classes: HouseholderQR (no pivoting, so fast but unstable),
ColPivHouseholderQR (column pivoting, thus a bit slower but more accurate) and FullPivHouseholderQR
(full pivoting, so slowest and most stable). Here is an example with column pivoting:
<table class="example">
<tr><th>Example:</th><th>Output:</th></tr>
<tr>
<td>\include LeastSquaresQR.cpp </td>
<td>\verbinclude LeastSquaresQR.out </td>
</tr>
</table>
\section LeastSquaresNormalEquations Using normal equations
Finding the least squares solution of \a Ax = \a b is equivalent to solving the normal equations
<i>A</i><sup>T</sup><i>Ax</i> = <i>A</i><sup>T</sup><i>b</i>. This leads to the following code:
<table class="example">
<tr><th>Example:</th><th>Output:</th></tr>
<tr>
<td>\include LeastSquaresNormalEquations.cpp </td>
<td>\verbinclude LeastSquaresNormalEquations.out </td>
</tr>
</table>
If the matrix \a A is ill-conditioned, then this is not a good method, because the condition number
of <i>A</i><sup>T</sup><i>A</i> is the square of the condition number of \a A. This means that you
lose roughly twice as many digits of accuracy using the normal equations as with the other methods.
*/
}


@ -96,6 +96,8 @@ namespace Eigen {
\ingroup DenseLinearSolvers_chapter */
/** \addtogroup TopicLinearAlgebraDecompositions
\ingroup DenseLinearSolvers_chapter */
/** \addtogroup LeastSquares
\ingroup DenseLinearSolvers_chapter */
/** \addtogroup DenseLinearSolvers_Reference
\ingroup DenseLinearSolvers_chapter */


@ -167,8 +167,8 @@ Here is an example:
\section TutorialLinAlgLeastsquares Least squares solving
The most accurate method to do least squares solving is with an SVD decomposition. Eigen provides one
as the JacobiSVD class, and its solve() does least-squares solving.
Here is an example:
<table class="example">
@ -179,9 +179,10 @@ Here is an example:
</tr>
</table>
Other methods, potentially faster but less reliable, are to use a Cholesky decomposition of the
normal matrix or a QR decomposition. Our page on \link LeastSquares least squares solving \endlink
has more details.
\section TutorialLinAlgSeparateComputation Separating the computation from the construction


@ -0,0 +1,4 @@
MatrixXf A = MatrixXf::Random(3, 2);
VectorXf b = VectorXf::Random(3);
cout << "The solution using normal equations is:\n"
<< (A.transpose() * A).ldlt().solve(A.transpose() * b) << endl;


@ -0,0 +1,4 @@
MatrixXf A = MatrixXf::Random(3, 2);
VectorXf b = VectorXf::Random(3);
cout << "The solution using the QR decomposition is:\n"
<< A.colPivHouseholderQr().solve(b) << endl;