[SciPy-dev] ticket 390 ("scipy.optimize.fmin_bfgs fails without warning", Reported by: ycopin)

dmitrey openopt@ukr....
Wed Jul 18 08:50:03 CDT 2007


On Wed, 18 Jul 2007, Yannick Copin apparently wrote:
>> I actually did not check formally if the solution found is 
>> indeed a local minimum, but that would surprise me. So 
>> indeed the problem is not so much that the algo could fall 
>> on some secondary minima, but that it claims convergence 
>> while the requirements for convergence are probably not 
>> met (are they in this special case?)
>>     
I think it is better for you to check whether the stopping criteria are met
than for us to dig into your objfunc to verify it; that is rather
time-consuming and not really our duty. You should report a bug only once you
are sure the problem is not in your own code.
(No offense intended.)
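One quick check the reporter could run himself: evaluate the gradient at the point fmin_bfgs returns and compare its norm against the solver's gtol. A minimal sketch — the quadratic objfunc below is a hypothetical stand-in, not the reporter's actual function:

```python
import numpy as np
from scipy.optimize import fmin_bfgs, approx_fprime

# Hypothetical stand-in objective -- substitute your own objfunc here.
def objfunc(x):
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

x0 = np.zeros(2)
xopt = fmin_bfgs(objfunc, x0, disp=False)

# fmin_bfgs stops when the gradient norm drops below gtol (1e-5 by default),
# so if convergence is real the numerical gradient here should be small.
grad = approx_fprime(xopt, objfunc, np.sqrt(np.finfo(float).eps))
print(np.linalg.norm(grad, ord=np.inf))
```

If the printed norm is far above gtol, the solver terminated without meeting its own convergence requirement, which would support the bug report.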

Using a solver that relies on the gradient is already a mistake in your case.
Many (local) solvers are simply *unable* to find *any* minimum of
sin(1000*x) — for example, Naum Z. Shor's ralg implementation. They will fail
to solve the line-search subproblem from certain starting points. So you
can't demand that a local solver obtain ANY solution, local or global, for a
non-convex function with lots of local minima.
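To illustrate the point with a sketch (using a hypothetical multimodal function that oscillates more gently than sin(1000*x), so the line search at least runs): a BFGS run simply stops at whichever stationary point its start happens to lead to.

```python
import numpy as np
from scipy.optimize import fmin_bfgs

# Hypothetical multimodal test function (not the reporter's objfunc):
# a slow parabola with a fast oscillation on top -- many local minima,
# global minimum near x = -0.31.
def f(x):
    return np.sin(5.0 * x[0]) + 0.1 * x[0] ** 2

# Start well away from the global basin; BFGS lands at *some* nearby
# local minimum, and which one depends entirely on x0.
xopt = fmin_bfgs(f, np.array([4.0]), disp=False)
print(xopt, f(xopt))
```

The returned point is a legitimate answer from a local solver's point of view, even though it is generally not the global minimum.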

Since you have just nVars = 3, I don't understand why you don't use a global
solver, like anneal.
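For reference, a sketch of that approach on the same kind of hypothetical multimodal function, extended to three variables. (Note: scipy.optimize.anneal was later removed from SciPy; the example below uses dual_annealing, the modern simulated-annealing-style global solver, in its place.)

```python
import numpy as np
from scipy.optimize import dual_annealing

# Hypothetical three-variable multimodal function (nVars = 3):
# many local minima, global minimum near x_i = -0.31 in each coordinate.
def f(x):
    return np.sum(np.sin(5.0 * x) + 0.1 * x ** 2)

# A global solver only needs bounds, not a good starting point.
res = dual_annealing(f, bounds=[(-2.0, 2.0)] * 3, seed=0)
print(res.x, res.fun)
```

With only three variables the extra cost of a global search is negligible, and it sidesteps the line-search failures entirely.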
Regards, D.
