[SciPy-user] optimization using fmin_bfgs with gradient information

Sebastian Walter sebastian.walter@gmail....
Sat Jul 18 11:36:54 CDT 2009


I wouldn't find it hard to believe that your gradient function is wrong.
Could you post the code of your objective function?

Maybe you've just got the wrong sign. Near the optimum this would look OK
(the true gradient is close to zero there), but a descent step taken away
from the optimum is bound to fail.
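
Just to illustrate: here's a toy case where the gradient has the
wrong sign. check_grad at the optimum is tiny, because the true
gradient is ~0 there, but at any other point the error is obvious:

import numpy as np
from scipy.optimize import check_grad

f = lambda x: np.sum(x**2)   # minimum at x = 0
fprime = lambda x: -2.0 * x  # WRONG sign: should be +2.0 * x

xopt = np.zeros(3)
print(check_grad(f, fprime, xopt))  # tiny: the error is invisible here

x = np.array([1.0, -2.0, 0.5])
print(check_grad(f, fprime, x))     # large (~9.2): the error shows up here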

Just a wild guess,
Sebastian



2009/7/18 Ernest Adrogué <eadrogue@gmx.net>:
> Hi,
>
> I'm using optimize.fmin_bfgs to find the minimum of a function.
> It works, but I'd like to speed it up by supplying the gradient
> of the function to fmin_bfgs.
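>
> For reference, the call looks something like this (f, fprime and x0
> stand in for my actual objective, gradient and starting point):
>
> from scipy import optimize
>
> # without the gradient, fmin_bfgs approximates it internally
> # by finite differences
> xopt = optimize.fmin_bfgs(f, x0)
>
> # with the gradient, it uses my analytical fprime instead
> xopt = optimize.fmin_bfgs(f, x0, fprime=fprime)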
>
> Without supplying the gradient I get these results:
>
> Warning: Desired error not necessarily achieved due to precision loss
>         Current function value: 638.939214
>         Iterations: 73
>         Function evaluations: 4387
>         Gradient evaluations: 102
>
> The output of my gradient function evaluated at xopt:
>
> [ -1.26071352e-06   2.22057130e-06   9.10389060e-06  -3.47809758e-06
>   5.26179023e-06  -5.90267183e-06  -3.19019368e-06  -7.39985613e-06
>  -2.84634204e-06   3.84543574e-07   1.33341847e-06  -1.59029471e-06
>   5.13325055e-06  -3.53840419e-06   2.23408274e-06   1.05588332e-05
>   1.04574907e-05  -2.46512209e-06  -2.54991167e-07   1.24356893e-06
>  -9.28475141e-06  -2.76441219e-07  -2.81902992e-06   7.59715257e-08
>  -4.61241275e-07  -1.57030283e-06   4.43909204e-06   6.66069772e-08
>  -1.64478684e-06   4.03578664e-06   6.81269187e-07   9.74726616e-06
>  -5.92372950e-06   7.85634341e-06  -1.48669281e-06   2.67525449e-07
>   3.50545615e-07   1.44128199e-06   2.71466860e-06   4.23270815e-06
>  -2.20851113e-05]
>
> The output of check_grad at xopt: 0.000106523006716
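>
> (As far as I can tell, check_grad just compares my gradient with a
> forward-difference approximation, roughly:
>
> import numpy as np
> from scipy.optimize import approx_fprime
>
> eps = np.sqrt(np.finfo(float).eps)
> diff = fprime(xopt) - approx_fprime(xopt, f, eps)
> err = np.sqrt(np.sum(diff**2))
>
> so a value around 1e-4 across 41 parameters looks small to me.)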
>
> So it looks like my gradient function is correct, doesn't it?
> However, when I pass the gradient to fmin_bfgs I get this:
>
> Warning: Desired error not necessarily achieved due to precision loss
>         Current function value: 653.494345
>         Iterations: 5
>         Function evaluations: 45
>         Gradient evaluations: 44
>
> Notice that the minimum is higher now.
>
> My gradient evaluated at xopt is far from zero this time:
>
> [  5.99210031   7.78372931   2.17685535   8.62438169   6.78737246
>   4.59089064   6.28766488   3.74376886   5.21582577   1.20448784
>   0.26857912   5.17257475   5.32668068   8.14539521   3.21022361
>   5.87014267   5.14406772   6.26400519   4.35807008   5.20230664
>   1.32962472   6.05407954   2.85062903   5.29204265  10.4366293
>   1.78770855  -2.22449411   5.20648252   4.05410094   6.64206808
>   2.19202177   5.33385709   5.30404265   3.73158178   4.44347609
>   4.38591199   3.12390498   7.01723668   3.93901794   6.31246349
>   3.61374379]
>
> And check_grad at xopt says: 34.3464575331
>
> I can't figure out what's going on. From the output of check_grad,
> it seems that my gradient function computes the gradient correctly
> at one point and incorrectly at another. Is that correct?
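>
> In case it helps, this is how I would compare the two gradients
> component by component at the point where the second run stopped
> (xopt2 is just my name for that point):
>
> import numpy as np
> from scipy.optimize import approx_fprime
>
> eps = np.sqrt(np.finfo(float).eps)
> num = approx_fprime(xopt2, f, eps)  # numerical gradient
> ana = fprime(xopt2)                 # my analytical gradient
> print(np.abs(num - ana).argmax())   # which component disagrees most
> print(np.abs(num - ana).max())      # and by how much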
>
> Ernest

