[SciPy-dev] code for incremental least squares

josef.pktd@gmai...
Wed Feb 17 09:10:05 CST 2010

On Wed, Feb 17, 2010 at 3:32 AM, Nathaniel Smith <njs@pobox.com> wrote:
> 2010/2/16 Stéfan van der Walt <stefan@sun.ac.za>:
>> Forming the product A^T A is often a bad idea from a numerical
>> perspective.
> Right.
>> In Chapter 3 of Åke Björck's "Numerical Methods for Least
>> Squares Problems", he talks about "update" problems.  He mentions that
>> "...the solution should be accurate up to the limitation of data and
>> conditioning of the problem; i.e., a stable numerical method must be
>> used."
>> He describes the Kalman-based update method (which bears some
>> resemblance to yours), but says that "the main disadvantage...is its
>> serious sensitivity to roundoff errors.  The updating algorithms based
>> on orthogonal transformations developed in the following sections are
>> generally to be preferred."  He then goes into more detail on updating
>> the QR and Gram-Schmidt decompositions.
>> Not sure if that helps, but it may be worth reading that chapter.
> Definitely -- thanks for the reference! Looks like the library has a
> copy available, too...
> (Not that I'll necessarily bother for my own use, since like I said,
> my problems seem to be fairly well conditioned and the sparse X'X +
> Cholesky approach is very fast, and also generalizes easily to mixed
> effect models, which I'm currently working on implementing.)
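For context, a minimal sketch of the "X'X + Cholesky" approach described above (this is my own illustration, not Nathaniel's code; it uses dense matrices rather than sparse ones for brevity). It also shows why Stéfan's objection matters: forming X'X squares the condition number of X.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

def cholesky_lstsq(X, y):
    # Form the normal equations X'X b = X'y and solve via Cholesky.
    # Fast and easy to update incrementally (just accumulate X'X and
    # X'y block by block), but X'X has cond(X)**2, which is the
    # numerical objection raised above.
    xtx = X.T @ X
    xty = X.T @ y
    c, low = cho_factor(xtx)
    return cho_solve((c, low), xty)

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.01 * rng.standard_normal(100)
b = cholesky_lstsq(X, y)
```

For well-conditioned problems this agrees with an ordinary least-squares solve to high accuracy.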
>>> For the QR code I use a recurrence relation I stole from some slides
>>> by Simon Wood to compute R and Q'y incrementally; probably a real
>>> incremental QR (e.g., "AS274", which is what R's biglm package uses)
>>> would be better, but this one is easy to implement in terms of
>>> non-incremental QR.
>> Incremental QR is something we should implement in scipy.linalg, for sure.
> +1
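To make the thread self-contained, here is a sketch of the kind of recurrence described above (stack the current R on top of each new data block and re-factor with a non-incremental QR); the function names are mine, and a true incremental QR such as AS274 would avoid refactoring from scratch:

```python
import numpy as np

def qr_update(R, qty, X_block, y_block):
    # One step of the recurrence: QR-factor [R; X_block].  The new R
    # and Q'y summarize all data seen so far; Q itself is never kept,
    # so memory stays O(k**2) regardless of the number of rows.
    Q, R_new = np.linalg.qr(np.vstack([R, X_block]))
    qty_new = Q.T @ np.concatenate([qty, y_block])
    return R_new, qty_new

k = 3
R = np.zeros((k, k))       # starting from zeros is fine: the first
qty = np.zeros(k)          # step just factors the first block
rng = np.random.default_rng(1)
X = rng.standard_normal((200, k))
y = X @ np.array([2.0, 0.0, -1.0]) + 0.1 * rng.standard_normal(200)
for i in range(0, 200, 50):                 # feed the data in 4 blocks
    R, qty = qr_update(R, qty, X[i:i + 50], y[i:i + 50])
beta = np.linalg.solve(R, qty)              # solve the triangular system
```

The recurrence is algebraically exact, so `beta` matches a one-shot least-squares solve on the full data.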

Just a follow up question for the experts.

In the description of some estimators and the optimization routines
used, I have read that the updating is done on the Cholesky or QR
decomposition of the inverse matrix ((X'X)^{-1} or the inverse Hessian).
Except for simple least squares problems, the inverse is often more
important than the original X'X.

Are there algorithms available for this, and is there an advantage to
either approach? From what I have seen in incremental_ls, the updating
works on the QR decomposition of X'X, and the inverse is then taken
from that decomposition, assuming a well-behaved, non-singular X'X.
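One observation on that last point (my own note, not from incremental_ls): once an R factor is available, the inverse never has to be formed from X'X directly, since X = QR implies X'X = R'R and hence (X'X)^{-1} = R^{-1} R^{-T}. A sketch, assuming a non-singular R:

```python
import numpy as np
from scipy.linalg import solve_triangular

def xtx_inv_from_r(R):
    # X = QR  =>  X'X = R'R  =>  (X'X)^{-1} = R^{-1} R^{-T}.
    # Only the triangular factor is inverted, never X'X itself.
    Rinv = solve_triangular(R, np.eye(R.shape[0]))
    return Rinv @ Rinv.T

rng = np.random.default_rng(2)
X = rng.standard_normal((50, 4))
R = np.linalg.qr(X, mode='r')   # only the R factor is needed
cov = xtx_inv_from_r(R)
```

This gives the (X'X)^{-1} needed for standard errors without ever squaring the condition number by forming X'X.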

I will need more time to work through this, but I will come back to it soon.
"from njs.util import block_diagonal_stack" seems to be the only
missing function.



> -- Nathaniel
> _______________________________________________
> SciPy-Dev mailing list
> SciPy-Dev@scipy.org
> http://mail.scipy.org/mailman/listinfo/scipy-dev
