[SciPy-user] Fwd: [sage-devel] Fwd: (Summer Of Code) are you interested in numerical optimization (+solving non-smooth non-linear systems of equations)?
Thu Feb 22 14:28:33 CST 2007
[ Forwarded from the SAGE dev list, since this may well be right up
someone's alley from the scipy crowd]
---------- Forwarded message ----------
From: William Stein <firstname.lastname@example.org>
Date: Feb 22, 2007 1:00 PM
Subject: [sage-devel] Fwd: (Summer Of Code) are you interested in
numerical optimization (+solving non-smooth non-linear systems of
equations)?
To: email@example.com, firstname.lastname@example.org
Does anybody have any thoughts about this potentially excellent idea
of a package that could help SAGE?
---------- Forwarded message ----------
From: dmitrey <email@example.com>
Date: Feb 22, 2007 1:56 AM
Subject: (Summer Of Code) are you interested in numerical optimization
(+solving non-smooth non-linear systems of equations)?
Hello William Stein!
I found your email address in
As far as I understand, there are very few constrained solvers for
Python. On Google I failed to find anything but CVXOPT, which consists
mostly of wrappers to the commercial MOSEK, and some LP/MIP wrappers to GNU
C- or Fortran code (and some optimization routines from scipy, of course).
I'm a last-year post-graduate student (Institute of Cybernetics, National
Academy of Sciences of Ukraine, optimization department). Our department has
researched optimization methods for non-smooth (& noisy) functions since 1964
or so (under the leadership of academician Naum Z. Shor until 2002, when he
passed away), and a parallel department under the leadership of academician
I. Sergienko & dr. V. Shilo researches combinatorial problems (something like
MATLAB bintprog, MAXCUT, etc.; some weeks ago they published an article on
their GRASP-based code that won a comparison vs CPLEX and other commercial
solvers).
So all our software is open-source & free, mostly written in Fortran & C.
3-4 months ago I began to write (in m-files) OpenOpt for MATLAB/Octave
(version 0.15 was released on November 25, 2006). It's an equivalent of the
commercial TOMLAB or GAMS (currently puny of course, but free (GNU GPL2)) and
contains 4 global solvers that I connected to the OpenOpt environment from
the MathWorks File Exchange, 2 of my own local non-smooth solvers -
ShorEllipsoid for nVars = 1...10 & ralg for nVars = 1...1000 - and
nonSmoothSolve, a MATLAB fsolve equivalent for non-smooth & noisy functions.
There is a good comparison in Examples/nonSmoothSolveEx.m, which shows that
fsolve fails even on mildly non-smooth functions, but nonSmoothSolve does not.
ralg & ShorEllipsoid are MATLAB fminsearch equivalents; however, they (as
well as nonSmoothSolve) can handle constraints such as
h(x)=0; % as far as I understood from the CVXOPT documentation, it can't
handle these constraints
as well as gradients or subgradients df, dc, dh of f, c, h.
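For readers unfamiliar with this family of methods, here is a minimal,
unconstrained Python sketch of the classical Shor/ellipsoid idea; it is only
an illustration of subgradient-based non-smooth minimization, not the actual
ShorEllipsoid solver, its interface, or its constraint handling (all names
and the test function below are mine, not OpenOpt's):

```python
import numpy as np

def ellipsoid_minimize(f, subgrad, x0, R, iters=200):
    """Textbook ellipsoid method for non-smooth convex minimization.
    A much-simplified, unconstrained sketch; not the real ShorEllipsoid code."""
    n = len(x0)
    c = np.asarray(x0, dtype=float)
    P = (R ** 2) * np.eye(n)            # initial ball of radius R around x0
    best_x, best_f = c.copy(), f(c)
    for _ in range(iters):
        g = subgrad(c)
        if not np.any(g):               # zero subgradient: c is optimal
            break
        gt = g / np.sqrt(g @ P @ g)     # metric-normalized subgradient
        Pg = P @ gt
        c = c - Pg / (n + 1)            # shift the ellipsoid center
        P = (n ** 2 / (n ** 2 - 1.0)) * (P - (2.0 / (n + 1)) * np.outer(Pg, Pg))
        if f(c) < best_f:
            best_x, best_f = c.copy(), f(c)
    return best_x, best_f

# A non-smooth convex test function with minimum 0 at (1, -2):
f = lambda x: abs(x[0] - 1) + abs(x[1] + 2)
sg = lambda x: np.array([np.sign(x[0] - 1), np.sign(x[1] + 2)])
x, fx = ellipsoid_minimize(f, sg, x0=[5.0, 5.0], R=20.0)
```

The point of the example is that only function values and subgradients are
needed, which is exactly why such methods work where gradient-based solvers
like fminsearch/fsolve struggle.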
MATLAB fminsearch can't handle anything mentioned above, and in 95% of cases
it loses in comparison to ralg (but I can't say I have tried too many
examples). Just try
x0 = cos(1:60).';
or Lemarechal.m from OpenOpt/test (convex, continuous, non-smooth).
These and some more examples are in OpenOpt/Examples; see ooexample5.m,
ooexample2.m, and others. That directory also contains some pictures of
convergence, automatically generated by those files.
OpenOpt also performs auto-scaling (though I have not tested it properly
yet); patterns of f, c, h (used when no (sub)gradient is provided by the
user) can greatly speed up calculations; there is the possibility of parallel
calculations while obtaining (sub)gradients numerically (via MATLAB dfeval;
Octave users must provide a similar function in prob.parallel.fun); and some
more features.
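To illustrate the "patterns" feature mentioned above: if the solver knows
which variables each function actually depends on, numerical differentiation
can skip the rest. A minimal Python sketch under that assumption (the names
fd_grad and pattern are mine, not OpenOpt's):

```python
import numpy as np

def fd_grad(f, x, pattern, h=1e-6):
    """Forward finite-difference gradient that evaluates f only for the
    variables marked in `pattern`, leaving the other entries at zero."""
    g = np.zeros_like(x)
    f0 = f(x)
    for i in np.flatnonzero(pattern):
        xp = x.copy()
        xp[i] += h
        g[i] = (f(xp) - f0) / h
    return g

f = lambda x: x[0] ** 2 + 3 * x[2]        # depends only on x[0] and x[2]
pattern = np.array([1, 0, 1], dtype=bool)  # declared dependency pattern
g = fd_grad(f, np.array([1.0, 5.0, 2.0]), pattern)
# g is approximately [2, 0, 3]
```

With m constraint functions each depending on a few of n variables, this
kind of bookkeeping cuts the number of function evaluations per Jacobian
from roughly m*n to the number of nonzeros in the pattern.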
So, are you interested in a Python version of OpenOpt? If yes, I would be
able to get in contact with the other Kiev institute, the Institute of
Systems Analysis, where a department under the leadership of academician
Pshenichniy (who passed away some months ago) & dr. Nikitin develops smooth
optimization algorithms; their IP-based solvers (constrained, of course,
including 2nd-order solvers) are considered to be among the most significant.
So I probably would be able to write for you a Python OpenOpt version (GNU
GPL2) with an essential equivalent of MATLAB fmincon, as well as ralg,
ShorEllipsoid, bintprog; drawing pictures of convergence; using patterns of
dependences; parallel computation of (sub)gradients of f(), c(), h();
network problem solvers; etc.
If you have any questions, or can add any financial support, or want to see
my CV (I have about 1-1.5 years of Python experience & 3-4 years of MATLAB),
or anything else - you can contact me via email or ICQ 275 - 976 - 670
(invisible).
The more financial support can be obtained, the more time I can spend on the
OpenOpt Python version, & the more workers of the icyb optimization
department can be involved in the OpenOpt for Python development (any salary
amount is significant to Ukrainian workers).
If I gain enough money, I propose:
creating the same environment for Python as is done for MATLAB/Octave
(1-1.5 months);
writing the ralg() & ShorEllipsoid() solvers (unconstrained: ~1 week,
constrained: +2-3 weeks);
writing nonSmoothSolve(): ~1-2 weeks;
writing a MATLAB bintprog (f*x->min, A*x<=b, Aeq*x=beq) equivalent based
on dr. Shilo's (& others') version of GRASP (works better than the current
CPLEX!): ~2 weeks;
writing a MATLAB fmincon equivalent (smooth constrained optimization,
c(x)<=0, h(x)=0, linear constraints, + 1st & 2nd derivatives) based on the
works of dr. Nikitin & academician Pshenichniy (I can't estimate the time
for now; I must contact him first);
+ subsequent implementation of other solvers, their upgrade & maintenance.
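For reference, the bintprog problem class above (f*x -> min, A*x <= b, x
binary) can be stated today with SciPy's MILP interface. This only
demonstrates the problem formulation, not the GRASP heuristic mentioned
above, and the small knapsack data is made up for illustration:

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

# Binary integer program: minimize f*x subject to A*x <= b, x in {0, 1}^n.
# Toy knapsack: maximize value v*x (i.e. minimize -v*x) under a weight limit.
v = np.array([3.0, 4.0, 5.0])            # item values (made-up data)
w = np.array([[2.0, 3.0, 4.0]])          # item weights, one constraint row
res = milp(c=-v,
           constraints=LinearConstraint(w, -np.inf, 5.0),  # w @ x <= 5
           integrality=np.ones(3),       # all variables integer...
           bounds=Bounds(0, 1))          # ...and bounded to {0, 1}
print(res.x, res.fun)                    # chosen items and objective value
```

Here the optimum picks the first two items (total weight 5, total value 7),
so res.fun is -7.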
I guess that in the future you could easily connect the Python code to the
SAGE project, for example as you did with the MAXIMA package.
FTP where OpenOpt versions are stored:
(you'd better download OpenOpt0.36.tar.bz2 from here)
my page at the MATLAB File Exchange area
Let me attach a graphic file generated by the OpenOpt ooexample3.m
Associate Professor of Mathematics
University of Washington
URLs: http://sage.scipy.org/sage/ and http://modular.math.washington.edu/sage/