Thu Jun 27 09:13:33 CDT 2013
Yes, leastsq is probably what I need.
As Josef suggested, I can use a Cholesky decomposition of the observation
covariance matrix to transform the model into one with independent observations.
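To make the whitening step concrete, here is a minimal sketch (the linear model, data, and covariance kernel are made up for illustration): if Sigma = L L^T is the observation covariance, then solving L r_w = r gives whitened residuals with identity covariance, which leastsq can treat as independent.

```python
import numpy as np
from scipy.linalg import cholesky, solve_triangular
from scipy.optimize import leastsq

# Hypothetical example: fit y = a*x + b under correlated noise with
# known covariance Sigma (here an exponential kernel, chosen arbitrarily).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 20)
Sigma = 0.01 * np.exp(-np.abs(x[:, None] - x[None, :]) / 0.2)
L = cholesky(Sigma, lower=True)          # Sigma = L @ L.T
y = 2.0 * x + 1.0 + L @ rng.standard_normal(x.size)

def residuals(p):
    # Whiten the raw residuals: solve L r_w = r so that r_w has
    # identity covariance, reducing this to ordinary least squares.
    r = y - (p[0] * x + p[1])
    return solve_triangular(L, r, lower=True)

popt, cov_x, infodict, mesg, ier = leastsq(residuals, x0=[1.0, 0.0],
                                           full_output=True)
```

With the residuals whitened this way, the sum of squares minimized by leastsq is exactly the generalized least-squares objective r^T Sigma^{-1} r.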
It is still not very clear how to obtain the analyzed (or posterior)
covariance matrix around the solution.
At first glance, cov_x is what we are looking for, but the documentation
says:
Uses the fjac and ipvt optional outputs to construct an estimate of the
jacobian around the solution. None if a singular matrix encountered
(indicates very flat curvature in some direction). This matrix must be
multiplied by the residual variance to get the covariance of the parameter
estimates – see curve_fit.
Isn't "jacobian" an error in the documentation? I would have expected the
covariance (inverse Hessian) there.
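For reference, the scaling step the documentation mentions can be sketched as follows (the data and model are invented for illustration): multiply cov_x by the residual variance, i.e. the sum of squared residuals divided by the degrees of freedom, to get the parameter covariance as curve_fit does.

```python
import numpy as np
from scipy.optimize import leastsq

# Hypothetical example: fit y = a*x + b and scale cov_x into a
# parameter covariance estimate.
rng = np.random.default_rng(1)
x = np.linspace(0.0, 10.0, 50)
y = 3.0 * x + 0.5 + rng.normal(scale=0.2, size=x.size)

def residuals(p):
    return y - (p[0] * x + p[1])

popt, cov_x, infodict, mesg, ier = leastsq(residuals, x0=[1.0, 0.0],
                                           full_output=True)

# Residual variance: sum of squared residuals over degrees of freedom.
dof = len(x) - len(popt)
s_sq = (infodict['fvec'] ** 2).sum() / dof
pcov = cov_x * s_sq  # parameter covariance estimate, as in curve_fit
```

Note that if the residuals have already been whitened by the known observation covariance, the residual variance should be close to 1 and this scaling is essentially a consistency check.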
2013/6/27 Matt Newville <email@example.com>
> I'm pretty baffled by these questions. optimize.leastsq() does not
> take a covariance matrix as input, but can give one as output. It
> can take functions used to compute the Jacobian... Perhaps that would
> accomplish what you're trying to do?
> optimize.curve_fit() is a wrapper around leastsq() for the common case
> of "fitting data" in which one has a set of observations at a set of
> sampled "data points", and a set of variables used in a model for the
> data. Like leastsq(), it returns the covariance. If curve_fit()
> does what you need but seems sub-optimal, then leastsq() is probably
> what you want to use.
> Hope that helps, but maybe I'm not understanding what you're trying to do.
> --Matt Newville
> SciPy-User mailing list