Wed Jun 26 11:47:07 CDT 2013
On Wed, Jun 26, 2013 at 12:26 PM, Frédéric Parrenin wrote:
> Dear all,
> I am experimenting with the optimize module of scipy.
> My optimization problem is a leastsq problem.
> However, the leastsq function seems inappropriate for two reasons:
> - there is no way to specify a covariance matrix between the leastsq
> terms. They are assumed to be independent, which is too strong an
> assumption in my case.
> - the analyzed covariance matrix (i.e. the inverse of the jacobian of the
> cost function) cannot simply be output.
> Of course, I could use a more generic optimization function, such as
> minimize.
> However, this seems sub-optimal because the minimisation of a least squares
> problem can be handled more efficiently (the jacobian of the cost function
> can be approximated using the jacobian of the terms to minimize).
> Can anybody help me?
> Are there plans to improve the leastsq function?
leastsq is a low-level function and I think we should not load it up
with any options.
For weighted least squares, the more high-level interface with
additional results is optimize.curve_fit.
However it doesn't allow for a full covariance matrix for the errors.
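For the diagonal case that curve_fit does handle, per-observation standard deviations can be passed via the sigma argument and the parameter covariance comes back as pcov. A minimal sketch (the model and all data here are made up for illustration):

```python
import numpy as np
from scipy.optimize import curve_fit

# hypothetical exponential-decay model, just for illustration
def model(x, a, b):
    return a * np.exp(-b * x)

rng = np.random.default_rng(0)
x = np.linspace(0, 4, 50)
sigma = 0.05 * np.ones_like(x)          # per-observation standard deviations
y = model(x, 2.5, 1.3) + rng.normal(0, sigma)

# sigma weights the residuals; pcov is the estimated parameter covariance
popt, pcov = curve_fit(model, x, y, p0=(1.0, 1.0), sigma=sigma)
perr = np.sqrt(np.diag(pcov))           # parameter standard errors
```

So the "analyzed" covariance of the parameters is available directly, just not with a full error covariance matrix for the observations.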
If you want to use leastsq with a full covariance matrix, then you
could transform both sides yourself, similar to what is done in
curve_fit, but with the Cholesky factor of the inverse covariance matrix.
We use that in statsmodels.GLS, but only for linear models.
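A sketch of that transform-both-sides idea (the model, data, and covariance below are invented for illustration): solving against the lower Cholesky factor L of Sigma is equivalent to multiplying the residuals by the Cholesky factor of the inverse covariance, so leastsq then sees approximately uncorrelated errors.

```python
import numpy as np
from scipy.linalg import cholesky, solve_triangular
from scipy.optimize import leastsq

rng = np.random.default_rng(1)
x = np.linspace(0, 4, 40)

# build an AR(1)-style error covariance purely for illustration
rho = 0.6
idx = np.arange(40)
Sigma = rho ** np.abs(np.subtract.outer(idx, idx))
L = cholesky(Sigma, lower=True)                  # Sigma = L @ L.T

# synthetic data with correlated noise, true parameters (2.0, 1.0)
y = 2.0 * np.exp(-1.0 * x) + L @ (0.05 * rng.normal(size=40))

def residuals(theta):
    return y - theta[0] * np.exp(-theta[1] * x)

def whitened_residuals(theta):
    # L^{-1} r has (approximately) iid errors, so ordinary leastsq applies
    return solve_triangular(L, residuals(theta), lower=True)

theta_hat, cov_x, info, msg, ier = leastsq(
    whitened_residuals, x0=(1.0, 1.0), full_output=True)
```

cov_x here is the curvature-based parameter covariance (up to the residual variance scale), which also addresses the covariance-output question.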
But if there is a large number of observations, then using the full
covariance matrix is inefficient, and in many cases a more direct
transformation can be used.
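As one concrete case of such a direct transformation (assuming AR(1)-correlated errors, which is my example here, not something from the thread): the Cholesky whitening of the full n x n covariance collapses to an O(n) quasi-differencing (Prais-Winsten) transform, so the dense matrix never needs to be formed.

```python
import numpy as np

n, rho = 200, 0.7
idx = np.arange(n)
Sigma = rho ** np.abs(np.subtract.outer(idx, idx))   # AR(1) correlation matrix

def quasi_difference(y, rho):
    # O(n) whitening for AR(1) errors: z_0 = sqrt(1-rho^2) y_0,
    # z_t = y_t - rho * y_{t-1}
    z = np.empty_like(y, dtype=float)
    z[0] = np.sqrt(1 - rho ** 2) * y[0]
    z[1:] = y[1:] - rho * y[:-1]
    return z

# The same transform as an explicit matrix P, only to check the algebra:
# P @ Sigma @ P.T should equal (1 - rho^2) * I, i.e. the errors are whitened.
P = np.zeros((n, n))
P[0, 0] = np.sqrt(1 - rho ** 2)
P[idx[1:], idx[1:]] = 1.0
P[idx[1:], idx[:-1]] = -rho
W = P @ Sigma @ P.T
```

The O(n) function is what one would use inside the residual callback; the dense P is only there to verify the claim.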
Nonlinear least squares is still largely missing in statsmodels.
I don't know whether any of the other packages that are based on
leastsq have the option.
> Best regards,
> Frédéric Parrenin
> SciPy-User mailing list