[SciPy-User] (Possible) new optimization routines - scipy.optimize

Andrea Gavana andrea.gavana@gmail....
Thu Feb 14 16:06:14 CST 2013


On 14 February 2013 22:53,  <josef.pktd@gmail.com> wrote:
> On Thu, Feb 14, 2013 at 4:01 PM, Andrea Gavana <andrea.gavana@gmail.com> wrote:
>> Hi All,
>>
>>     as my team and I are constantly facing very hard/complex numerical
>> optimization problems, I have taken a look at the various *global*
>> optimization routines available in Python and I thought I could throw
>> in a couple of algorithms I implemented, mostly drawing from my
>> previous thesis work.
>>
>> I have implemented two routines (based on numpy and scipy), namely:
>>
>> - AMPGO: Adaptive Memory Programming for Global Optimization: this is my Python
>>   implementation of the algorithm described here:
>>
>>   http://leeds-faculty.colorado.edu/glover/fred%20pubs/416%20-%20AMP%20(TS)%20for%20Constrained%20Global%20Opt%20w%20Lasdon%20et%20al%20.pdf
>>
>>   I have added a few improvements here and there based on my Master's
>>   thesis work on the standard Tunnelling Algorithm of Levy, Montalvo and Gomez.
>
> This could also be a good addition, similar to basinhopping.
> From my perspective, this kind of global optimizer is the most
> promising (compared to the evolutionary ones, ...)
>
> From a quick browse: Is the local optimizer fixed to a specific one,
> or can it be any available solver as in basinhopping?

It can be changed to be any available solver. As most of the
optimization problems I deal with are horrendously complex (i.e., no
gradient information available), I decided to use BOBYQA as the
default local solver inside AMPGO. So the benchmark comparison is
fair only against other derivative-free algorithms.
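
To make this concrete, a swap would look something like the sketch
below. This is illustrative only: the `ampgo` function name and the
`local_solver` keyword stand in for whatever the final API ends up
being.

import numpy as np

def rosenbrock(x):
    # Classic smooth test function, used here only as a stand-in
    # for a real black-box objective.
    return np.sum(100.0 * (x[1:] - x[:-1]**2)**2 + (1.0 - x[:-1])**2)

x0 = np.zeros(5)

# Derivative-free default, suited to black-box objectives:
# xbest, fbest = ampgo(rosenbrock, x0, local_solver='BOBYQA')

# Or, when gradients are available and cheap, any gradient-based
# scipy solver instead:
# xbest, fbest = ampgo(rosenbrock, x0, local_solver='L-BFGS-B')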

> The only thing I might worry about is that it only has 6 citations
> in Google Scholar (which might not mean much if the optimizer is not
> widely available).

I am not sure how relevant the Google Scholar citation count is to
the effectiveness of an algorithm: AMPGO was the only method able to
crack two of our latest (real-life) optimization problems (42 and 9
variables respectively), while all the others failed.

> Given that there seem to be many variations of this kind of
> optimizer, it might be good to have some background on how it
> compares with similar optimizers.

As far as I know, there is no open-source implementation of the
Tunnelling Algorithm. I used to have access to a Fortran
implementation (gradient-based only) in the past, but the code
belongs to the National Autonomous University of Mexico. There is no
Python implementation of it. AMPGO is a short-memory tunnelling
algorithm loosely connected with the standard tunnelling method.
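
For background, the tunnelling phase works roughly like this: once a
local minimum with value f_best is found, the objective is replaced
by a tunnelling function that is positive wherever f(x) > f_best and
has poles at the known minimizers, so that driving it to zero or
below "tunnels" into a new basin. A rough sketch of the classical
Levy-Montalvo formulation (my notation, not the exact AMPGO variant):

import numpy as np

def tunnelling(x, f, f_best, minimizers, lam=1.0):
    # Positive while f(x) > f_best; the pole at each already-found
    # minimizer pushes the search away from visited basins. The
    # tunnelling search is started from a perturbed point, never
    # exactly at a known minimizer (where the denominator vanishes).
    denom = np.prod([np.dot(x - m, x - m) ** lam for m in minimizers])
    return (f(x) - f_best) / denom

# Any point where tunnelling(...) <= 0 satisfies f(x) <= f_best and
# becomes the starting point for the next local optimization phase.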

> If you have other comparisons of similar optimizers, it would be
> useful to see them. Also given that you have a large benchmark suite,
> you could compare it with the new basinhopping in scipy.optimize.

I'll give it a try tomorrow and report back. I still don't have scipy
0.11 at work (they won't let me upgrade), but I guess I can just get
the Python source code for it and run it, can't I?
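
For anyone who wants to replicate the comparison, the usage I have in
mind is along these lines (Rastrigin is just a stand-in for the
benchmark functions):

import numpy as np
from scipy.optimize import basinhopping  # new in scipy 0.11

def rastrigin(x):
    # Standard multimodal benchmark with many regularly spaced minima.
    return 10.0 * len(x) + np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x))

x0 = np.random.uniform(-5.12, 5.12, size=5)
# Nelder-Mead keeps the comparison derivative-free, like the BOBYQA
# default inside AMPGO.
res = basinhopping(rastrigin, x0, niter=200,
                   minimizer_kwargs={'method': 'Nelder-Mead'})
print(res.x)
print(res.fun)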

>> - Firefly: the Firefly algorithm; this is my Python implementation
>>   of the procedure described here:
>>
>>   http://www.mathworks.com/matlabcentral/fileexchange/29693-firefly-algorithm
>
> the fileexchange has a large number of "animal" optimizers, and I
> doubt they are all good.
>
> I think there needs to be a convincing case before any of them is
> added to scipy.
> On the other hand, having them available outside of scipy would make
> it easier to try them out.
> (and see if fireflies, or bees or ants are doing better :)

I agree. I did it as an exercise to see if I remembered my Matlab
after 10 years of using Numpy/Scipy only :-)
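
Just for reference, the whole method essentially boils down to one
move rule. A minimal sketch of Yang's update (parameter names and
defaults here are mine):

import numpy as np

def firefly_move(xi, xj, alpha=0.2, beta0=1.0, gamma=1.0):
    # Firefly i drifts toward a brighter firefly j: attractiveness
    # decays with the squared distance between them, plus a small
    # random walk term.
    r2 = np.dot(xi - xj, xi - xj)
    beta = beta0 * np.exp(-gamma * r2)
    return xi + beta * (xj - xi) + alpha * (np.random.rand(xi.size) - 0.5)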


Andrea.

"Imagination Is The Only Weapon In The War Against Reality."
http://www.infinity77.net

