[SciPy-user] Getting started with optimize.fmin_l_bfgs_b
ckkart at hoc.net
Wed Jun 29 06:46:22 CDT 2005
Nils Wagner wrote:
> Christian Kristukat wrote:
>> Nils Wagner wrote:
>>> Traceback (most recent call last):
>>> File "bfgs_b.py", line 8, in ?
>>> best, val, d = optimize.fmin_l_bfgs_b(func, guess, bounds=bounds)
>>> line 183, in fmin_l_bfgs_b
>>> f, g = func_and_grad(x)
>>> line 135, in func_and_grad
>>> f, g = func(x, *args)
>>> TypeError: unpack non-sequence
>>> How can I resolve the problem?
>> Look at the docstring of l_bfgs_b - func is expected to return the
>> function and its gradient!
>> Regards, Christian
> Hi Christian,
> This is my modified program:
> from scipy import *
> def func(x):
> return x[0]-x[1]
> def fprime(x):
> return array(([1,-1]))
It has to be a float array, otherwise l_bfgs_b will complain.
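For instance (a sketch; the underlying Fortran routine works on double-precision values, so an integer array can trip it up):

```python
from numpy import array

def fprime(x):
    # Use float literals so the array dtype is float64,
    # which is what the l_bfgs_b Fortran code expects.
    return array([1.0, -1.0])
```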
> guess = 1.2, 1.3
> bounds = [(-2.0,2.0), (-2.0,2.0) ]
> best, val, d = optimize.fmin_l_bfgs_b(func, guess, fprime,
> approx_grad=True, bounds=bounds, iprint=2)
Now, because of 'approx_grad=True', the gradient is approximated and fprime is
not evaluated. You should always prefer to calculate the gradient analytically,
especially if it's as simple as in your example.
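Assuming the objective meant here is f(x) = x[0] - x[1] (which is what the gradient [1, -1] corresponds to), the call with an analytic gradient would look something like this sketch: fprime is passed as the third argument and approx_grad is simply left out.

```python
from numpy import array
from scipy import optimize

def func(x):
    # f(x) = x[0] - x[1]; its gradient is the constant vector [1, -1].
    return x[0] - x[1]

def fprime(x):
    return array([1.0, -1.0])

guess = array([1.2, 1.3])
bounds = [(-2.0, 2.0), (-2.0, 2.0)]

# Without approx_grad=True, the analytic gradient is actually used.
best, val, d = optimize.fmin_l_bfgs_b(func, guess, fprime, bounds=bounds)
```

On this box the linear objective is minimized at the corner (-2, 2), where it takes the value -4.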
> print 'Position of the minimum',best, 'and its value',val
> Is it somehow possible to visualize (e.g. by matplotlib) the history of
> the optimization process ?
You could collect the values of x at each call of 'func' in a global list and
plot them later.
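Something like this sketch, say (`history` is just a name I chose; note that func is also called during the gradient approximation, so the list records every evaluation point, not only the accepted iterates):

```python
import numpy as np
from scipy import optimize

history = []  # global list collecting every x at which func is evaluated

def func(x):
    history.append(np.array(x))  # store a copy of the current point
    return x[0] - x[1]

guess = [1.2, 1.3]
bounds = [(-2.0, 2.0), (-2.0, 2.0)]
best, val, d = optimize.fmin_l_bfgs_b(func, guess, approx_grad=True,
                                      bounds=bounds)

xs = np.array(history)  # shape (n_evaluations, 2)
# Afterwards the path can be plotted, e.g. with matplotlib:
# import pylab
# pylab.plot(xs[:, 0], xs[:, 1], 'o-')
# pylab.show()
```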