[SciPy-user] fmin won't optimize

Yaroslav Bulatov yaroslavvb at gmail.com
Thu Nov 4 19:33:07 CST 2004


Hi

I'm fitting a Gibbs distribution to data using maximum likelihood. If I
use fmin_powell, it works, but when I use fmin, it terminates after 1
iteration regardless of the starting point. The log-likelihood function
is convex. Any idea why this would happen?

The code below terminates with a negative log-likelihood of 4.158883,
whereas the minimum is 3.296169.
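
One way to see why fmin stops so early is to ask it for termination
diagnostics. A minimal sketch, assuming the documented xtol/ftol/maxiter/
full_output keyword arguments of scipy.optimize.fmin, applied to the
neg_ll defined in the script below:

from scipy.optimize import fmin

# Diagnostic run: tighten tolerances and return termination info
xopt, fopt, iters, funcalls, warnflag = fmin(
  neg_ll, [0.0, 0.0, 0.0],
  xtol=1e-8, ftol=1e-8,        # tighter convergence tolerances
  maxiter=1000, maxfun=2000,   # generous iteration/evaluation budgets
  full_output=1, disp=1)       # return (xopt, fopt, iter, funcalls, warnflag)
print 'warnflag %d after %d iterations, %d function evaluations' \
      % (warnflag, iters, funcalls)

If warnflag comes back 0, fmin believes it converged under the given
tolerances, which would point at the function values it sees near the
starting point rather than at an iteration limit.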

# Train simple 3 parameter Gibbs Distribution

import math
from scipy import *
from scipy.optimize import *

train_set=[(0,0),(0,1),(1,1)]
#test_set=[(0,0),(0,1),(1,1)]
  
# Negative log-likelihood
def neg_ll(lambdas):
  # Unnormalized Gibbs weight for a binary pair x = (x0, x1)
  unnormalized = lambda x: math.exp(lambdas[0]*x[0] + lambdas[1]*x[1]
                                    + lambdas[2]*x[0]*x[1])
  # Partition function: sum over all four binary configurations
  Z = sum([unnormalized(x) for x in [(0,0),(0,1),(1,0),(1,1)]])
  # Sum of per-example log-probabilities; negate for minimization
  ll = 0
  for x in train_set:
    ll += math.log(unnormalized(x)) - math.log(Z)
  return -ll

def main():
  x0 = [0,0,0]
  xopt = fmin(neg_ll, x0)
  print 'Before training: %f' %(neg_ll(x0),)
  print 'After training: %f' %(neg_ll(xopt),)
  
if __name__=='__main__': main()
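
In case it helps to reproduce the discrepancy, here is a small
side-by-side sketch that runs both the simplex and Powell routines from
the same starting point (fmin_powell takes the same (func, x0) calling
convention as fmin):

# Compare the two optimizers on the same objective and start point
x0 = [0.0, 0.0, 0.0]
for name, opt in [('fmin', fmin), ('fmin_powell', fmin_powell)]:
  xopt = opt(neg_ll, x0)
  print '%-12s neg_ll = %f at %s' % (name, neg_ll(xopt), xopt)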


