[SciPy-dev] Abstract vectors in optimization
Tue Jan 6 07:28:07 CST 2009
Ben FrantzDale wrote:
> Thanks for the reply. Let me describe the problem I've thought most
> about to give you a sense of where I am coming from:
> The optimization problem I have worked most with is atomistic
> simulation (particularly relaxation) in which you have millions of
> atoms all interacting with a force function. The force function is
> sparse but as the atoms move, the sparseness pattern changes (with
> sparseness structures that can be updated automatically when the state
> vector is "moved"). Furthermore, the atoms can be distributed via MPI,
> so while they happen to be a size-3n array of doubles locally, local
> atom 1,234 at one moment may not be the same atom as local atom 1,234
> at the next step. (I don't know the details of PyMPI or how garbage
> collection might interact with MPI, but using SciPy for
> massively-parallel optimization is another motivation for this post.)
> A common solver to use to relax the state of this sort of system is
> nonlinear CG. However, I've seen many separate implementations of CG
> hand-written to operate on these data structures. As a software
> engineer, this is sad to see; really one nonlinear CG solver should be
> reusable enough that it can be dropped in and used. I tried writing a
> sufficiently-generic CG solver in C++, but designing the right family
> of types, particularly with memory-management issues in mind, got out
> of hand. Given that the cost function is the expensive part of the
> sorts of problems I am interested in, the overhead of Python is tiny,
> and SciPy has nice optimization libraries, which led me to consider
> generalizing SciPy functions.
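A reusable nonlinear CG of the kind described above can be sketched by making
the solver touch the state only through a handful of duck-typed operations.
The sketch below is hypothetical (Fletcher-Reeves with a simple Armijo
backtracking line search, not SciPy's own implementation); any object
supporting +, unary -, scalar *, and the supplied dot() could be dropped in,
including a distributed atom array whose dot() does an MPI reduction:

```python
import numpy as np

def nonlinear_cg(f, grad, x0, dot=np.dot, tol=1e-8, max_iter=200):
    """Fletcher-Reeves nonlinear CG, generic over the vector type.

    The state x is only combined via +, unary -, scalar *, and dot(),
    so the same solver works on plain ndarrays or on any abstract
    vector type implementing those operations.
    """
    x = x0
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        gg = dot(g, g)
        if gg < tol ** 2:
            break
        # Armijo backtracking line search along the search direction d.
        alpha, fx, slope = 1.0, f(x), dot(g, d)
        while f(x + alpha * d) > fx + 1e-4 * alpha * slope and alpha > 1e-12:
            alpha *= 0.5
        x = x + alpha * d
        g_new = grad(x)
        beta = dot(g_new, g_new) / gg  # Fletcher-Reeves coefficient
        d = -g_new + beta * d
        g = g_new
    return x

# Sanity check on a tiny quadratic: |x - (1, 2)|^2 has its minimum at (1, 2).
target = np.array([1.0, 2.0])
xmin = nonlinear_cg(lambda x: float(np.dot(x - target, x - target)),
                    lambda x: 2.0 * (x - target),
                    np.zeros(2))
```

The point is the interface, not the particular CG variant: nothing in the
loop assumes the vector is a local contiguous array.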
I think I simply did not understand what you meant by scipy; for me,
scipy means the whole package - and in that case, slicing and other
advanced features of numpy certainly are useful. But it looks like you
had in mind only the optimize package; in that context, Robert's answer
is certainly better than mine.
> I agree with Robert's comment about non-Euclidean metrics, such as
> working with SO(3). Similarly, I have thought it would be interesting
> to optimize over function spaces (as in finite-element solvers) in which it
> should be possible to add two functions over the same domain but with
> different discretizations, so that __add__ would implicitly remesh one
> or the other functions rather than having __add__ require the same
> basis functions on the LHS and RHS so it can just add corresponding
> scalar coefficients.
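That remeshing __add__ can be illustrated with a toy piecewise-linear
function type on a 1-D mesh. This is a hypothetical sketch (the class name
and the union-mesh strategy are my own invention, not any FE package's API):

```python
import numpy as np

class MeshFunction:
    """A piecewise-linear function on a 1-D mesh of nodes.

    __add__ remeshes: if the operands use different node sets, the sum
    is sampled on the union of the two meshes via linear interpolation,
    instead of requiring identical bases on the LHS and RHS.
    """
    def __init__(self, nodes, values):
        self.nodes = np.asarray(nodes, dtype=float)
        self.values = np.asarray(values, dtype=float)

    def __call__(self, x):
        # Evaluate by linear interpolation between stored nodal values.
        return np.interp(x, self.nodes, self.values)

    def __add__(self, other):
        mesh = np.union1d(self.nodes, other.nodes)  # common refinement
        return MeshFunction(mesh, self(mesh) + other(mesh))

f = MeshFunction([0.0, 1.0], [0.0, 1.0])            # f(x) = x, coarse mesh
g = MeshFunction([0.0, 0.5, 1.0], [1.0, 1.0, 1.0])  # g(x) = 1, finer mesh
h = f + g                                           # sampled on the union mesh
```

A real implementation would have to worry about differing domains and
interpolation error, but the operator-level idea is the same.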
That can definitely be useful, and not only for optimization.
Optimization is not a field I can claim to know a lot about, but those
more abstract arrays with overridable 'core' aspects are certainly useful
in other problems as well. Taking an example nearer to my own field, a
package called lastwave for wavelets has some neat tricks related to
indexing, including automatic interpolation, which can be very useful
for signal processing and is a bit similar to your example of
'remeshing', if I understand correctly.
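The kind of indexing trick I have in mind could look like the following
hypothetical wrapper (loosely in the spirit of lastwave, not its actual
API): integer indices return stored samples, while fractional indices are
interpolated on the fly:

```python
import numpy as np

class InterpSignal:
    """A signal whose __getitem__ accepts fractional indices.

    s[1] returns the stored sample; s[1.5] linearly interpolates
    between samples 1 and 2, so resampling happens transparently
    at the indexing level.
    """
    def __init__(self, samples):
        self.samples = np.asarray(samples, dtype=float)

    def __getitem__(self, t):
        # np.interp handles both scalar and array-valued indices t.
        return np.interp(t, np.arange(len(self.samples)), self.samples)

s = InterpSignal([0.0, 2.0, 4.0])
```

Overriding the 'core' indexing operation this way is exactly the sort of
hook an abstract-vector interface would need to expose.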