[SciPy-dev] Proposal for more generic optimizers (posted before on scipy-user)

Matthieu Brucher matthieu.brucher@gmail....
Wed Apr 18 01:36:00 CDT 2007


2007/4/18, Michael McNeil Forbes <mforbes@physics.ubc.ca>:
>
> Okay, I think we are thinking similar things with different terminology:



Yes, I think that too.


> I think you are saying that only one object should maintain state
> (your "optimizer") (I was originally sharing the state which I agree
> can cause problems).  If so, I agree, but to me it seems that object
> should be called a "partially optimized function".  I think of an
> "optimizer" as something which modifies state rather than something
> that maintains state.



Well, in fact the real object that has a state is the step; the optimizer
could have a state, but I do not currently use that approach. I'll think
about the dependencies that keeping the state in the optimizer only would lead
to.

That would mean that each call to the step, the criterion or the line search
would take another parameter: the state of the optimizer. Let's say it's a
dict. Each object would read and modify some of its values, and in fact that
is what I already pass to the recordHistory function - I think I'll rename it
record_history to be compliant with the scipy coding standard - so it would
not be much trouble to do this.
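
To make that concrete, here is a rough sketch of the main loop with the state
threaded through every component (the signatures are only illustrative, not
the current API):

def optimize(function, x0, step, line_search, criterion, record=None):
    # Sketch only: a dict-based state shared by step, line search,
    # criterion and the recording function.
    state = {'new_parameters': x0, 'new_value': function(x0),
             'old_parameters': None, 'old_value': None, 'iteration': 0}
    while not criterion(state):      # the criterion reads the keys it needs
        direction = step(function, state)
        alpha = line_search(function, direction, state)
        state['old_parameters'] = state['new_parameters']
        state['old_value'] = state['new_value']
        state['new_parameters'] = state['old_parameters'] + alpha * direction
        state['new_value'] = function(state['new_parameters'])
        state['iteration'] += 1
        if record is not None:
            record(state)            # record_history sees the whole dict
    return state['new_parameters']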


> I am thinking of code like:
>
> ------
> roughly_locate_minimum = Optimizer(criterion=extremelyWeakCriterion,
>                                    step=slowRobustStep, ...)
> find_precise_minimum = Optimizer(criterion=preciseCriterion,
>                                  step=fasterStep, ...)
>
> f = Rosenbrock(...)
> x0 = ...
>
> f_min = OptimizedFunction(f,x0)
> f_min.optimize(optimizer=roughly_locate_minimum)
> f_min.optimize(optimizer=find_precise_minimum)
>
> #OR (this reads better to me, but the functions should return copies
> of f_min, so may not be desirable for performance reasons)
> f_min = roughly_locate_minimum(f_min)
> f_min = find_precise_minimum(f_min)
>
> # Then one can query f_min for results:
> print f_min.x   # Best current approximation to optimum
> print f_min.f
> print f_min.err # Estimated error
> print f_min.df
> # etc...
> -----
>
> The f_min object keeps track of all state, can be passed from one
> optimizer to another, etc.  In my mind, it is simply an object that
> has accumulated information about a function.



You mean you would want to share the state between optimizers?


> The idea I have in
> mind is that f is extremely expensive to compute, thus the object
> with state f_min accumulates more and more information as it goes
> along.



Well, part of this is already done by the recordHistory function, but I don't
think that saving every whole state in f_min is a good idea from a memory
point of view. Why not save only the last state, with all the needed values?
For instance, the old and new values, the old and new parameters, the old
and new step, the new gradient, ... I think the number of iterations should be
there as well ;)
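
For instance, a stopping criterion only needs that last state to decide
whether to stop; a sketch, with illustrative key names only:

def relative_value_criterion(state, ftol=1e-8, max_iter=1000):
    # Sketch only: stop when the last decrease of the function value is
    # small relative to the old value, or after too many iterations.
    if state['old_value'] is None:   # first call: no step taken yet
        return False
    decrease = abs(state['old_value'] - state['new_value'])
    small = decrease <= ftol * max(abs(state['old_value']), 1.0)
    return small or state['iteration'] >= max_iter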


> Ultimately this information could be used in many ways, for
> example:
>
> - f_min could keep track of roughly how long it takes to compute f
> (x), thus providing estimates of the time required to complete a
> calculation.
> - f_min could keep track of values and use interpolation to provide
> fast guesses etc.
>
> Does this mesh with your idea of an "optimizer"?  I think it is
> strictly equivalent, but looking at the line of code
> "optimizer.optimize()" is much less useful to me than "f_min.optimize
> (optimizer=...)".
>
> What would your ideal "user" code look like for the above use-case?



Well, not exactly, but it would be almost like that. I do not know what you
want to put in OptimizedFunction and what its role would be exactly.
My ideal code is straightforward - in fact it is close to the current code:
f = Rosenbrock(...)
x0 = ...

roughly_locate_minimum_optimizer = StandardOptimizer(
    function=f, x0=x0,
    step=Step.SomeStep(...),
    lineSearch=LineSearch.InexactLineSearch(...),
    criterion=Criterion.SomeCriterion(...),
    record=SomeRecordingFunctionIfNeeded)
local_minimum = roughly_locate_minimum_optimizer.optimize()

precisely_locate_minimum_optimizer = StandardOptimizer(
    function=f, x0=local_minimum,
    step=Step.SomeOtherStep(...),
    lineSearch=LineSearch.ExactLineSearch(...),
    criterion=Criterion.SomeOtherCriterion(...),
    record=SomeRecordingFunctionIfNeeded)
minimum = precisely_locate_minimum_optimizer.optimize()


Using the OptimizedFunction to save a state shared by different optimizers
that do not save the same things could lead to side effects that are difficult
to track. What could be done, as I said, is to output the state at the end of
the optimizer, and perhaps allow the user to give it to a new optimizer,
which would only take the parts that it needs.
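
For instance (a sketch only; import_state and the key names are
hypothetical):

def import_state(old_state, known_keys=('new_parameters', 'gradient',
                                        'iteration')):
    # The new optimizer copies only the entries it knows how to use, so
    # solver-specific entries from the previous optimizer cannot leak in
    # and cause hidden side effects.
    return dict((k, old_state[k]) for k in known_keys if k in old_state)

# Suppose the first optimizer finished with this state...
final_state = {'new_parameters': [1., 1.], 'gradient': [0., 0.],
               'iteration': 42, 'some_solver_specific_cache': None}
# ...the second one starts from a filtered copy: the shared keys survive,
# the solver-specific cache does not.
initial_state = import_state(final_state)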


> I will try to flesh out a more detailed structure for the
> OptimizedFunction class,
> Michael.


I'm looking forward to seeing this ;)

Matthieu