[SciPy-user] Getting started with optimize.fmin_l_bfgs_b
Nils Wagner
nwagner at mecha.uni-stuttgart.de
Wed Jun 29 07:00:55 CDT 2005
Christian Kristukat wrote:
> Nils Wagner wrote:
>
>> Christian Kristukat wrote:
>>
>>> Nils Wagner wrote:
>>>
>>>> Traceback (most recent call last):
>>>>   File "bfgs_b.py", line 8, in ?
>>>>     best, val, d = optimize.fmin_l_bfgs_b(func, guess, bounds=bounds)
>>>>   File "/usr/local/lib/python2.4/site-packages/scipy/optimize/lbfgsb.py", line 183, in fmin_l_bfgs_b
>>>>     f[0], g = func_and_grad(x)
>>>>   File "/usr/local/lib/python2.4/site-packages/scipy/optimize/lbfgsb.py", line 135, in func_and_grad
>>>>     f, g = func(x, *args)
>>>> TypeError: unpack non-sequence
>>>>
>>>> How can I resolve the problem?
>>>
>>> Look at the docstring of l_bfgs_b - func is expected to return the
>>> function and its gradient!
>>>
>>> Regards, Christian
>>>
>>> _______________________________________________
>>> SciPy-user mailing list
>>> SciPy-user at scipy.net
>>> http://www.scipy.net/mailman/listinfo/scipy-user
>>
>> Hi Christian,
>>
>> This is my modified program:
>>
>> from scipy import *
>>
>> def func(x):
>>     return x[0] - x[1]
>>
>> def fprime(x):
>>     return array([1, -1])
>
>
> It has to be a float array, otherwise l_bfgs_b will complain.
>
>> guess = 1.2, 1.3
>> bounds = [(-2.0,2.0), (-2.0,2.0) ]
>> best, val, d = optimize.fmin_l_bfgs_b(func, guess, fprime,
>>                                        approx_grad=True, bounds=bounds, iprint=2)
>
>
> Now, because of 'approx_grad=True', the gradient is approximated and
> fprime is not evaluated. You should always prefer to calculate the
> gradient analytically, especially if it's as simple as in your example.
>
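For completeness, here is a minimal self-contained sketch of the analytic-gradient call Christian describes (my own illustration, not code from the thread; written for a current NumPy/SciPy, where fmin_l_bfgs_b still lives in scipy.optimize):

```python
import numpy as np
from scipy import optimize

def func(x):
    # objective only; the gradient is supplied separately via fprime
    return x[0] - x[1]

def fprime(x):
    # analytic gradient, as a float array (l_bfgs_b requires floats)
    return np.array([1.0, -1.0])

guess = [1.2, 1.3]
bounds = [(-2.0, 2.0), (-2.0, 2.0)]

# no approx_grad here, so fprime is actually evaluated
best, val, d = optimize.fmin_l_bfgs_b(func, guess, fprime, bounds=bounds)
```

For this linear objective the minimizer should sit at the corner (-2.0, 2.0) of the box, with value -4.0.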
BTW, are you aware of the paper by Martins
"The complex-step derivative approximation" ACM Trans. Math. Soft. 29
(3) 2003 pp. 245-262
This might be an option instead of using finite differences.
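The complex-step trick from that paper fits in a few lines; a hedged sketch (my illustration, not code from the thread): for a real-analytic f, the derivative follows from Im f(x + ih)/h with a tiny h, with none of the subtractive cancellation of finite differences.

```python
import numpy as np

def complex_step_grad(f, x, h=1e-20):
    """Gradient of a real-analytic f at x via the complex-step method:
    df/dx_k ~ Im(f(x + i*h*e_k)) / h. No subtraction of nearly equal
    numbers occurs, so h can be made tiny without roundoff trouble."""
    x = np.asarray(x, dtype=complex)
    grad = np.empty(x.size)
    for k in range(x.size):
        xp = x.copy()
        xp[k] += 1j * h          # perturb one coordinate along the imaginary axis
        grad[k] = np.imag(f(xp)) / h
    return grad

# check on f(x) = x0*x1 + sin(x0): gradient is (x1 + cos(x0), x0)
f = lambda x: x[0] * x[1] + np.sin(x[0])
g = complex_step_grad(f, [1.2, 1.3])
```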
from scipy import *

def func(x):
    # return the function value and its gradient together
    return x[0] - x[1], array([1.0, -1.0])

def fprime(x):
    return array([1.0, -1.0])

guess = 1.2, 1.3
bounds = [(-2.0, 2.0), (-2.0, 2.0)]
#best, val, d = optimize.fmin_l_bfgs_b(func, guess, fprime,
#                                      approx_grad=True, bounds=bounds, iprint=2)
best, val, d = optimize.fmin_l_bfgs_b(func, guess, bounds=bounds, iprint=-1)
print 'Position of the minimum', best, 'and its value', val
>> print 'Position of the minimum',best, 'and its value',val
>>
>> Is it somehow possible to visualize (e.g. with matplotlib) the history
>> of the optimization process?
>
>
> You could collect the values of x at each call of 'func' in a global
> list and plot them later.
>
Could you please send me an example of how to realize this in my case?
Thanks in advance.
Best regards,
Nils
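One way to realize Christian's suggestion, sketched on the same toy problem (my illustration, not from the thread; the `history` name is made up): append each x inside func and plot the collected points afterwards.

```python
import numpy as np
from scipy import optimize

history = []  # global list: one entry per evaluation of func

def func(x):
    history.append(np.array(x, copy=True))
    # value and gradient returned together, as fmin_l_bfgs_b expects
    return x[0] - x[1], np.array([1.0, -1.0])

guess = [1.2, 1.3]
bounds = [(-2.0, 2.0), (-2.0, 2.0)]
best, val, d = optimize.fmin_l_bfgs_b(func, guess, bounds=bounds)

# afterwards, e.g. with matplotlib:
#   import matplotlib.pyplot as plt
#   xs = np.array(history)
#   plt.plot(xs[:, 0], xs[:, 1], 'o-')
#   plt.show()
```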
> Regards, Christian
>
>