[SciPy-user] optimization advice needed

Neal Becker ndbecker2@gmail....
Sun Jan 27 05:27:40 CST 2008


I have an optimization problem that doesn't quite fit in the usual
framework.

The problem is to minimize the mean-squared error between a sequence of
noisy observations and a model.

Let's suppose there are two parameters in the model, (a, b), so we
observe g = f(a, b) + n, where n is noise.

Assume all I know about the problem is that it is probably convex.
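
For concreteness, here is a minimal sketch of the setup I mean; the
particular f, noise level, and numbers are just placeholders:

import numpy as np

# Stand-in model: the real f is a black box; a toy exponential keeps
# the sketch runnable.
def f(a, b, t):
    return a * np.exp(-b * t)

def mse(params, t, g):
    a, b = params
    return np.mean((g - f(a, b, t)) ** 2)

# One block of noisy observations of the "true" parameters.
t = np.linspace(0.0, 1.0, 50)
g = f(2.0, 0.5, t) + 0.1 * np.random.randn(t.size)

print(mse((1.0, 1.0), t, g))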

Now a couple of things are unusual:
1) The problem is not to optimize the estimates (a', b') just once - it
is more of an optimal control problem: (a, b) are slowly varying, and we
want to refine the estimates continuously.

2) We want an inversion of the usual control.  Rather than having the
optimization algorithm call my function, I need my function to call the
optimization.  Specifically, I will generate one _new_ random vector of
observations and then perform one iteration of the optimization on that
observation.  (In the past, I have adapted the simplex algorithm to
work this way.)  See the sketch below for the call pattern I mean.
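
To make the call pattern concrete, here is roughly the interface I am
after (it reuses the toy f and mse from the sketch above; the update
shown is just a finite-difference gradient step, not the simplex
variant I actually used, so treat it as a placeholder):

import numpy as np

class StepwiseOptimizer:
    """Holds the current estimate; the caller pushes one observation
    block per call and gets back a slightly refined estimate."""

    def __init__(self, params0, step=0.05, eps=1e-4):
        self.params = np.asarray(params0, dtype=float)
        self.step = step   # size of each descent step
        self.eps = eps     # finite-difference perturbation

    def update(self, cost, *args):
        # Forward-difference gradient of cost(params, *args).
        base = cost(self.params, *args)
        grad = np.zeros_like(self.params)
        for i in range(self.params.size):
            p = self.params.copy()
            p[i] += self.eps
            grad[i] = (cost(p, *args) - base) / self.eps
        # One descent step on this observation block only.
        self.params = self.params - self.step * grad
        return self.params

# My code drives the loop, not the optimizer: each pass generates a
# new random observation vector and asks for one refinement step.
opt = StepwiseOptimizer((1.0, 1.0))
for _ in range(1000):
    t = np.linspace(0.0, 1.0, 50)
    g = f(2.0, 0.5, t) + 0.1 * np.random.randn(t.size)
    est = opt.update(mse, t, g)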

So, any advice on how to proceed?


