[SciPy-user] optimization using fmin_bfgs with gradient information
Wed Jul 22 11:33:32 CDT 2009
20/07/09 @ 10:43 (+0200), thus spake Sebastian Walter:
> thanks for the function :).
> It is now part of pyadolc's unit test (Line 246).
> I added two functions: one is providing the gradient by finite
> differences and the other by using automatic differentiation.
> The finite differences gradient has very poor accuracy, only the first
> three digits are correct.
So what does your automatic differentiation do when it reaches
a point where the function is not differentiable?
SciPy's optimize.approx_fprime() returns an arbitrarily
large value. For example, let's suppose that f(x) is
In : def f(x):
         if x > 2:
             return x**2
         return 0.  # constant branch, so f'(x) = 0 for x < 2
then the derivative of f at x=2 doesn't exist, mathematically
speaking. f'(x) is 0 for x<2 and 2*x for x>2, if I understand
correctly. This is the output of approx_fprime for the function f:
In : [opt.approx_fprime((i,),f,eps) for i in numpy.linspace(1.95,2.05,9)]
As you can see, it returns a large number for f'(2).
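For reference, here is a self-contained sketch of that experiment (I'm assuming eps is the usual square root of machine epsilon; the exact value wasn't shown above). The forward difference at x=2 straddles the jump, so the quotient blows up:

```python
import numpy as np
from scipy import optimize as opt

def f(x):
    # scalar version of the piecewise function above
    # (else-branch value 0 reconstructed from the quoted derivatives)
    x0 = float(np.asarray(x).ravel()[0])
    return x0**2 if x0 > 2 else 0.0

eps = np.sqrt(np.finfo(float).eps)  # assumed step size, ~1.5e-8

# forward-difference gradients on a grid straddling x = 2
for xi in np.linspace(1.95, 2.05, 9):
    g = opt.approx_fprime(np.array([xi]), f, eps)[0]
    print(xi, g)
```

Away from x=2 the estimates agree with 0 (left) and 2*x (right); at x=2 itself the estimate is on the order of 1/eps.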
My question is: for the purposes of optimising f(x), what
should my gradient function return at x=2 so that the
optimisation algorithm works well? I would have said it should
return 0, but seeing what approx_fprime does, I'm not sure any more.
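For what it's worth, a small experiment with a hand-written gradient that returns 0 at and below the non-smooth point (this choice is just one option, not an established answer) seems to let fmin_bfgs make progress:

```python
import numpy as np
from scipy import optimize as opt

def f(x):
    # piecewise function discussed above (else-branch value assumed 0)
    x0 = float(np.asarray(x).ravel()[0])
    return x0**2 if x0 > 2 else 0.0

def fprime(x):
    # hand-written gradient: 2*x above the kink, 0 at and below it
    x0 = float(np.asarray(x).ravel()[0])
    return np.array([2.0 * x0 if x0 > 2 else 0.0])

# start on the smooth side and let BFGS descend across x = 2
xmin = opt.fmin_bfgs(f, np.array([3.0]), fprime=fprime, disp=0)
print(xmin, f(xmin))
```

Starting at x=3, the first line search already lands in the flat region x<=2, where the supplied gradient is 0 and the algorithm terminates at a minimiser (f=0).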