[SciPy-dev] one more scipy.optimize.line_search question
dmitrey
openopt@ukr....
Mon Aug 6 15:44:33 CDT 2007
Alan G Isaac wrote:
> On Mon, 06 Aug 2007, dmitrey apparently wrote:
>
>> I wonder what the parameter amax=50 means (in the
>> optimize.line_search func)? It seems this parameter is
>> never used in the func.
>>
>
> Is there something wrong with the minpack2 documentation?
>
> c stpmax is a double precision variable.
> c On entry stpmax is a nonnegative upper bound for the step.
> c On exit stpmax is unchanged.
>
The line_search func from scipy.optimize has no relation to the Fortran
routine you mentioned.
Maybe you mean the line_search func from scipy.optimize.linesearch, but as I
mentioned, scipy has 2 line_search funcs: one written in Python (in
optimize.py), the other in /optimize/linesearch.py.
So, as I said, amax is unused in scipy.optimize.linesearch.
>
>
>> moreover, amax is defined in scipy.optimize module as
>> a func
>>
>
> I think you are confusing this with the numpy array method?
>
>
>
>> Also, in the middle of the func (optimize.py) there is
>> the line
>> maxiter = 10
>> which seems very small to me.
>> Don't you think it would be better to take this param from the input args?
>>
>
> That is just for the bracketing phase.
> Are any troubles resulting from this value?
>
I have troubles all over the func. Sometimes Matthieu's func, which
pursues the same goal (finding an x that satisfies the strong Wolfe
conditions), works much better (in the 1st iteration, but it makes my CPU
hang in the 2nd), while in other cases scipy.optimize provides at
least some decrease and Matthieu's does not (also, as I mentioned above,
Matthieu's func sometimes returns the same x0 after the 2nd iteration and
makes my alg stop very far from x_optim).
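For reference, both funcs target the strong Wolfe conditions. A minimal
1-D sketch of the acceptance test (my own illustration, not SciPy's or
Matthieu's code; the helper name strong_wolfe and the scalar setup are my
assumptions) looks like this:

```python
# A minimal sketch of the strong Wolfe conditions both line searches
# try to satisfy. Scalar case only; strong_wolfe is a hypothetical helper.
def strong_wolfe(f, grad, x, d, alpha, c1=1e-4, c2=0.9):
    """Check whether step length alpha satisfies the strong Wolfe
    conditions for f along direction d from x (1-D case)."""
    phi0 = f(x)                       # phi(0)  = f(x)
    dphi0 = grad(x) * d               # phi'(0) = f'(x) * d
    phi_a = f(x + alpha * d)          # phi(alpha)
    dphi_a = grad(x + alpha * d) * d  # phi'(alpha)
    sufficient_decrease = phi_a <= phi0 + c1 * alpha * dphi0
    curvature = abs(dphi_a) <= c2 * abs(dphi0)
    return sufficient_decrease and curvature

# Example: f(x) = x**2 from x = 1 along the descent direction d = -1;
# the full step alpha = 1 lands exactly on the minimum.
f = lambda x: x * x
g = lambda x: 2.0 * x
print(strong_wolfe(f, g, 1.0, -1.0, 1.0))   # prints True
print(strong_wolfe(f, g, 1.0, -1.0, 2.0))   # overshoots: prints False
```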
1) This test, Matthieu's func:
itn 0: Fk= 8596.39550577 maxResidual= 804.031293334
11
(between these lines I call Matthieu's optimizer)
22
------------------
xnew:
objfun: 227.538805239
max residual: 2.17603712827e-12 (because I now use a test where only
linear inequalities are present)
11
(CPU hangs here)
2) Same test, scipy.optimize:
itn 0: Fk= 8596.39550577 maxResidual= 804.031293334
itn 100 : Fk= 8141.04226717 maxResidual= 784.640128722
itn 200 : Fk= 7708.62739973 maxResidual= 765.716680829
...
As you see, both the objFun decrease and the max-constraint decrease are
very slow.
If I provide a numerically obtained gradient of my func, nothing changes
except for some decrease in calculation speed.
I tried varying sigma (Matthieu's notation) = c2 (scipy's notation) over
0.1...0.9; almost nothing changes
(Matthieu's default val is 0.4, scipy's is 0.9, as in
http://en.wikipedia.org/wiki/Wolfe_conditions).
AFAIK c1 defaults to 0.0001 in both.
So now I'm trying to find where the problem is.
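To illustrate why the hard-coded maxiter = 10 mentioned above can matter:
in a backtracking-style search, maxiter bounds how many step contractions
are tried before giving up. This is my own hedged sketch (assumed
behaviour, not SciPy's actual bracketing code), using only the
sufficient-decrease condition with c1:

```python
# A sketch of a backtracking line search: maxiter bounds the number of
# step contractions, so a small value can stop the search before the
# sufficient-decrease (Armijo) condition with c1 is ever satisfied.
# Hypothetical helper, not SciPy's implementation.
def backtracking(f, fprime0, f0, alpha0=1.0, c1=1e-4, rho=0.5, maxiter=10):
    """Shrink alpha until f(alpha) <= f0 + c1*alpha*fprime0, trying at
    most maxiter step lengths. f maps a step length to the objective
    value along the search direction; fprime0 is the directional
    derivative at alpha = 0 (must be negative)."""
    alpha = alpha0
    for _ in range(maxiter):
        if f(alpha) <= f0 + c1 * alpha * fprime0:
            return alpha            # sufficient decrease satisfied
        alpha *= rho                # contract the step and retry
    return None                     # maxiter exhausted, no step accepted

# Example along phi(alpha) = (1 - alpha)**2: phi(0) = 1, phi'(0) = -2,
# and the very first trial step alpha = 1 is accepted.
phi = lambda a: (1.0 - a) ** 2
print(backtracking(phi, -2.0, 1.0))                        # prints 1.0
# A flat phi never satisfies the condition, so the search gives up:
print(backtracking(lambda a: 1.0, -2.0, 1.0, maxiter=3))   # prints None
```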
Regards, D.
>
>
>> Also, don't you think that having 2 line_search funcs is ambiguous?
>> (I mean one in scipy.optimize, python-written, and one in
>> scipy.optimize.linesearch, binding to minpack2)
>>
>
> It would be nice to have some history on this.
> I expect we'll have to wait until Travis has time to look
> this discussion over, which may not be soon.
>
> Cheers,
> Alan Isaac
>
>
> _______________________________________________
> Scipy-dev mailing list
> Scipy-dev@scipy.org
> http://projects.scipy.org/mailman/listinfo/scipy-dev
>
>
>
>