[SciPy-user] Linear regression with constraints
dmitrey
dmitrey.kroshko@scipy....
Wed May 7 12:36:42 CDT 2008
As far as I have seen from the example
(btw, this one has an analytical df)
http://projects.scipy.org/scipy/scikits/browser/trunk/openopt/scikits/openopt/examples/llsp_2.py
specialized solvers (like bvls) yield better results (objective function
values) than scipy's l_bfgs_b and tnc.
You could also try ALGENCAN; I haven't checked it myself (my OO<->
algencan connection was broken at the time). It is unlikely to yield a
better objective value, but it may take less time on some
large-scale problems.
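For the bounded straight-line fit itself, a specialized bounded linear least-squares solver can be called directly. A minimal sketch, assuming a recent SciPy that ships `scipy.optimize.lsq_linear` (which offers a BVLS method; this is not the OpenOpt interface discussed above, and the data and bound values below are made up for illustration):

```python
import numpy as np
from scipy.optimize import lsq_linear

# Synthetic (x_i, y_i) data, purely illustrative
rng = np.random.RandomState(0)
x = np.linspace(0.0, 10.0, 50)
y = 0.7 * x + 1.5 + 0.1 * rng.randn(50)

# Design matrix for y ~ m*x + c: one column for the slope, one for the intercept
A = np.column_stack([x, np.ones_like(x)])

# Bounds: 0 <= m <= slope_max, 0 <= c <= intercept_max (example values)
slope_max, intercept_max = 2.0, 5.0
res = lsq_linear(A, y, bounds=([0.0, 0.0], [slope_max, intercept_max]),
                 method='bvls')
m_fit, c_fit = res.x
```

Because the problem is linear in (m, c), this avoids general-purpose NLP solvers entirely; the bounded solution coincides with ordinary least squares whenever the unconstrained optimum already lies inside the box.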
Regards, D.
Jose Luis Gomez Dans wrote:
> Hi,
> I have a set of data (x_i, y_i), and would like to carry out a linear regression using least squares. Further, the slope and intercept are bounded: each has to lie between 0 and a maximum (slope_max and intercept_max, respectively).
>
> I have thought of using one of the "easy to remember" :D optimization methods in scipy that allow bounds (BFGS, for example). I can write the equation for the slope and intercept based on x_i and y_i, but I gather that I must provide a gradient estimate of the function at the point of evaluation. How does one go about this? Is this a 2-element array of grad(L) at (m_eval, c_eval)?
>
> Thanks!
> Jose
>
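On the gradient question quoted above: yes, for a bound-constrained quasi-Newton solver the gradient of the least-squares loss L(m, c) = 0.5 * sum((m*x_i + c - y_i)^2) is exactly the 2-element array (dL/dm, dL/dc) evaluated at the current point. A minimal sketch using scipy.optimize.fmin_l_bfgs_b (the data and bound values are made up for illustration):

```python
import numpy as np
from scipy.optimize import fmin_l_bfgs_b

# Synthetic (x_i, y_i) data, purely illustrative
rng = np.random.RandomState(1)
x = np.linspace(0.0, 10.0, 50)
y = 0.7 * x + 1.5 + 0.1 * rng.randn(50)

def loss(p):
    m, c = p
    r = m * x + c - y          # residuals
    return 0.5 * np.dot(r, r)

def grad(p):
    m, c = p
    r = m * x + c - y
    # 2-element gradient: (dL/dm, dL/dc)
    return np.array([np.dot(r, x), np.sum(r)])

bounds = [(0.0, 2.0), (0.0, 5.0)]   # example slope/intercept bounds
p0 = np.array([0.5, 0.5])
p_opt, f_opt, info = fmin_l_bfgs_b(loss, p0, fprime=grad, bounds=bounds)
m_fit, c_fit = p_opt
```

If writing the gradient by hand is inconvenient, fmin_l_bfgs_b can also approximate it by finite differences (approx_grad=True), at the cost of extra function evaluations.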