[SciPy-dev] SciPy improvements
Michael McNeil Forbes
Thu Apr 12 16:31:54 CDT 2007
On 11 Apr 2007, at 1:56 PM, Matthieu Brucher wrote:
> 2) Nonlinear solvers
>
> I have written these nonlinear solvers for the problem R(x) = 0, where
> x and R have dimension "n":
>
>   broyden1 - Broyden's first method - a quasi-Newton-Raphson method
>       for updating an approximate Jacobian and then inverting it.
>   broyden2 - Broyden's second method - the same as broyden1, but
>       updates the inverse Jacobian directly.
>   broyden3 - Broyden's second method - the same as broyden2, but
>       instead of directly computing the inverse Jacobian, it remembers
>       how to construct it using vectors, and when computing inv(J)*F,
>       it uses those vectors to compute this product, thus avoiding the
>       expensive NxN matrix multiplication.
>   broyden_generalized - generalized Broyden's method - the same as
>       broyden2, but instead of approximating the full NxN Jacobian,
>       it constructs it at every iteration in a way that avoids the
>       NxN matrix multiplication. This is not as precise as broyden3.
>   anderson - extended Anderson method - the same as
>       broyden_generalized, but with w_0^2*I added before taking the
>       inverse, to improve stability.
>   anderson2 - the Anderson method - the same as anderson, but
>       formulated differently.
>
> I use them in the self-consistent cycle of Density Functional Theory
> (so I use the terminology of the DFT literature in the names of the
> methods).
> Could the part that computes the step be separated from the
> function itself and the optimizer? I'm trying to "modularize"
> nonlinear solvers so as to select more efficiently what is needed -
> kind of optimizer, kind of step, kind of stopping criterion, ... -
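For concreteness, the inverse-Jacobian update behind broyden2 can be sketched in a few lines of NumPy. This is an illustrative sketch, not the implementation under discussion; the initial guess inv(J) ~ I and the small test system are my own assumptions:

```python
import numpy as np

def broyden2(R, x0, tol=1e-10, maxiter=100):
    """Broyden's second ("bad") method: keep an approximation
    G ~ inv(J) and update it directly, so no NxN linear solve
    (and no explicit Jacobian) is needed per iteration."""
    x = np.asarray(x0, dtype=float)
    G = np.eye(len(x))           # crude initial guess: inv(J) ~ I
    r = R(x)
    for _ in range(maxiter):
        if np.linalg.norm(r) < tol:
            break
        dx = -G @ r              # quasi-Newton step: dx = -inv(J) R
        x = x + dx
        r_new = R(x)
        dr = r_new - r
        # rank-1 update of the inverse: G += (dx - G dr) dr^T / (dr . dr)
        G += np.outer(dx - G @ dr, dr) / (dr @ dr)
        r = r_new
    return x

# a small illustrative test system (chosen for this sketch only)
def R(x):
    return np.array([x[0] + 0.5 * (x[0] - x[1])**3 - 1.0,
                     0.5 * (x[1] - x[0])**3 + x[1]])

x = broyden2(R, [0.0, 0.0])
```

Each iteration costs one function evaluation plus O(n^2) work for the rank-1 update, instead of the O(n^3) solve a plain Newton step would need.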
It should be possible to modularize these with a step class that
maintains the required state (the Jacobian, or its inverse, etc.).
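As a sketch of that kind of factoring (hypothetical names, not an actual proposal): the driver owns the iteration loop and the stopping test, while a step object owns whatever state the method needs -- for a Broyden step, the (inverse) Jacobian approximation. Here a trivial damped fixed-point step stands in:

```python
import numpy as np

class DampedStep:
    """Trivial step object: a damped fixed-point step with no state.
    A Broyden step would instead keep the (inverse) Jacobian
    approximation here and refresh it in update()."""
    def __init__(self, damping=0.5):
        self.damping = damping
    def direction(self, x, r):
        return -self.damping * r
    def update(self, dx, dr):
        pass  # nothing to maintain for a fixed-point step

def solve(R, x0, step, tol=1e-12, maxiter=1000):
    """Generic driver: owns iteration and stopping, and knows
    nothing about how the step direction is computed."""
    x = np.asarray(x0, dtype=float)
    r = R(x)
    for _ in range(maxiter):
        if np.linalg.norm(r) < tol:
            break
        dx = step.direction(x, r)
        x = x + dx
        r_new = R(x)
        step.update(dx, r_new - r)
        r = r_new
    return x

# e.g. solve x = cos(x) via R(x) = x - cos(x)
root = solve(lambda x: x - np.cos(x), [0.5], DampedStep(0.5))
```

Swapping in a different step class (Broyden, Anderson, damped Newton) then requires no change to the driver, which is exactly the kind of selection Matthieu describes.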
(Where is the latest version of your optimization proposal? I have
not had a chance to look at it yet, but have been meaning to, and
would like to see the latest version. Perhaps we should make a page
on the Wiki to collect suggestions and ideas.)
I have been meaning to get a good Broyden-based algorithm coded for
Python for a while. I have a MATLAB version of a globally convergent
Broyden implementation using a linesearch as a base, with a couple of
unusual features that may be useful (not specific to Broyden-based
methods). (The code is based on the presentation of the algorithm
given in Numerical Recipes, with some modifications suggested by
Dennis and Schnabel's book, and is partially documented.)
1) Variable tolerances. The idea is to quickly close in on the
   solution with low-tolerance calculations, and then tighten the
   tolerances as the code converges to the solution. This is useful
   if the function R(x) is computed with numerical integration or
   some similar technique that is quick for low tolerances but
   expensive for high-tolerance evaluations. (I have not seen this
   technique in generic optimization algorithms, but have found it
   very useful.)
2) Real-time bailout. This allows you to compute for a specified
   length of time and then return the best solution found within
   that time frame, rather than simply counting function calls.
3) Selective refinement of the Jacobian as the iteration proceeds.
   This amounts to monitoring the condition number of the Jacobian
   and recomputing parts of it selectively if it becomes
   ill-conditioned. (For this I update a factorization of J rather
   than maintaining inv(J).)
These things all slow down the fundamental algorithm, but are very
useful when the
function R(x) is extremely expensive to evaluate.
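A minimal sketch of feature 2 above (my own illustration; the step function and names are made up): wrap any per-iteration step in a wall-clock budget, and always keep the best iterate seen so far.

```python
import time
import numpy as np

def solve_with_bailout(R, step_fn, x0, max_seconds=1.0,
                       tol=1e-10, maxiter=1_000_000):
    """Iterate until the residual meets tol, maxiter is reached, or
    the wall-clock budget expires; always return the best iterate
    seen so far, rather than just counting function calls."""
    deadline = time.monotonic() + max_seconds
    x = np.asarray(x0, dtype=float)
    best_x, best_norm = x.copy(), np.linalg.norm(R(x))
    for _ in range(maxiter):
        x = step_fn(x)
        norm = np.linalg.norm(R(x))
        if norm < best_norm:       # keep the best point, not the last
            best_x, best_norm = x.copy(), norm
        if best_norm < tol or time.monotonic() >= deadline:
            break
    return best_x

# e.g. a damped fixed-point step for R(x) = x - cos(x)
best = solve_with_bailout(lambda x: x - np.cos(x),
                          lambda x: x - 0.5 * (x - np.cos(x)),
                          [0.0], max_seconds=5.0)
```

Because the best iterate is tracked separately from the current one, a late divergent step (common when R(x) is noisy or expensive to evaluate accurately) cannot spoil the returned answer.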