[SciPy-user] PyDSTool and SciPy (integrators)
Rob Clewley
rob.clewley@gmail....
Thu Apr 17 21:02:56 CDT 2008
On Thu, Apr 17, 2008 at 8:28 PM, Gabriel Gellner <ggellner@uoguelph.ca> wrote:
> > The pieces that look most useful to me for inclusion in SciPy are the
> > integrators, as you say. But it's not clear to me why I should prefer
> > dopri853 or radau5 to the Adams integrator built into scipy. Are
> > they somehow better for all functions? Take better advantage of the
> > smoothness of objective functions? More robust in the face of stiff or
> > partially-stiff systems?
Unfortunately, I'm not an expert in the numerical analysis associated
with these methods, but I know that Radau in particular is a very good
stiff integrator. Also, Radau lets you solve systems of the form
M(x,t) dx/dt = f(x,t) where M is a non-invertible mass matrix (we have
actually extended this feature to allow state and time dependence in
M). My colleague Erik hopes to add other Hairer-Wanner integrators
(e.g. one for delay-differential equations) at some point, using a
very similar interface to the one we have already developed.
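To make the mass-matrix form concrete, here is a minimal sketch for the case where M is invertible, reduced to explicit form and handed to scipy's Radau method. (A *singular* M needs a genuine DAE code such as Hairer and Wanner's radau5; the particular M and f here are made-up examples.)

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical system M(x) dx/dt = f(t, x) with an invertible,
# state-dependent mass matrix.
def M(x):
    return np.array([[1.0 + x[1]**2, 0.0],
                     [0.0,           2.0]])

def f(t, x):
    return np.array([-x[0] + x[1], -x[1]])

def rhs(t, x):
    # Reduce to explicit form dx/dt = M(x)^{-1} f(t, x).
    return np.linalg.solve(M(x), f(t, x))

sol = solve_ivp(rhs, (0.0, 5.0), [1.0, 1.0], method="Radau")
```

For a singular M this reduction is impossible, which is exactly why the extended radau5 interface described above is valuable.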
Beyond that, we should both read Hairer and Wanner's books in detail!
I finally bought them a few months ago but haven't managed to get
through much of them yet.
> > What I have needed is an integrator that offers more control. The
> > particular problem I ran into was one where I needed to be able to set
> > stopping conditions based on the dependent variables, not least
> > because my model stopped making sense at a certain point. I ended up
> > wrapping LSODAR from ODEPACK, and it made my life vastly easier. If
> > you have integrators that can provide that kind of control, that would
> > be very useful.
Right, that's what we have.
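As an illustration of the kind of stopping condition being described, here is a sketch using scipy's solve_ivp event interface (not PyDSTool's own API; the names are illustrative):

```python
from scipy.integrate import solve_ivp

def rhs(t, y):
    return [-0.5 * y[0]]

def hit_threshold(t, y):
    # Stopping condition on a dependent variable: fires when y[0]
    # crosses 0.1 (the integrator locates the zero of this function).
    return y[0] - 0.1

hit_threshold.terminal = True    # halt integration at the event
hit_threshold.direction = -1     # only trigger on downward crossings

sol = solve_ivp(rhs, (0.0, 100.0), [1.0], events=hit_threshold)
# sol.t_events[0][0] holds the stopping time (analytically ln(10)/0.5)
```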
> > Similarly something like a trajectory object that
> > knows enough about the integrator to provide a spline with the right
> > derivatives and knot spacing would be handy.
Well, we created the Trajectory class for just such a purpose at the
python level. You call the trajectory object as if it were a
continuously-defined function. As a prototype, it only does linear
interpolation between points, but the whole idea is that the
interpolation scheme is a modular part (in the Variable class) that
can be replaced with any kind of spline.
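A toy analogue of the idea (not PyDSTool's actual Trajectory class) might look like this, with scipy's interp1d standing in for the replaceable interpolation part:

```python
import numpy as np
from scipy.interpolate import interp1d

class Trajectory:
    """Stores integrator output and is callable like a continuous
    function of time.  Swapping interp1d's `kind` is the 'modular
    part' -- e.g. kind='cubic' for a spline."""

    def __init__(self, ts, ys, kind="linear"):
        self._interp = interp1d(ts, ys, kind=kind, axis=-1)
        self.t0, self.t1 = ts[0], ts[-1]   # domain bounds

    def __call__(self, t):
        t = np.asarray(t)
        if np.any(t < self.t0) or np.any(t > self.t1):
            raise ValueError("time outside trajectory domain")
        return self._interp(t)

ts = np.linspace(0.0, 1.0, 11)
traj = Trajectory(ts, np.exp(-ts))
traj(0.55)          # works for scalar or array time values
```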
> The dopri5 and dop853 codes have a really good interface for this: after
> each successful integration step they call a solution callback that
> allows for dense interpolation of the solution between stepping points (with
> error of the same order as the solution), and any kind of stopping or event
> detection routines. The hard part is thinking of good ways of making default
> callbacks that are extensible, but it is nice that the base codes were
> designed with this functionality in mind.
We don't currently pass on information about the dense output to the
python level. I think that could be worked out, though it would
obviously require a huge amount of extra memory; Erik would know
better about this prospect because he wrote the interfaces for PyDSTool.
Ultimately, a better solution would be a proper implementation of a
Taylor-series based integrator that is extremely accurate!
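For reference, scipy's own ode wrapper does expose dopri5's per-step callback via set_solout, which gives access to the stepping points (though not to the dense-output interpolant itself). A small sketch:

```python
from scipy.integrate import ode

steps = []

def solout(t, y):
    # Called after every accepted dopri5 step; returning -1 halts
    # the integration, so a stopping rule on the state fits here.
    steps.append((t, y.copy()))
    return -1 if y[0] < 0.1 else 0

solver = ode(lambda t, y: -0.5 * y).set_integrator("dopri5")
solver.set_solout(solout)            # register before integrating
solver.set_initial_value([1.0], 0.0)
solver.integrate(100.0)
```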
> Ultimately with my Cython wrapper you could just pass in an optional callback
> as well (coded in C or Cython as otherwise it would be painfully slow), that
> would give you any kind of control you might want. The way forward, though,
> is to have good default templates available; I have been looking at PyDSTool's
> method as well as how matlab (not the code, rather the API) deals with this
> control.
I would like to understand how to use Cython in this way -- I'm very
unfamiliar with it. Would you maybe share that wrapper with us so that
we can look into it as a possible replacement for SWIG? Can you
comment on how efficient it is in reducing the number of interfacing
calls between python/numpy and C? I heard that for Sage this was a big
improvement.
> If you know of any other good examples of event detection or stopping-rule
> APIs I would love to look at them!
Our user-defined event detection functions get created at the C-level
for extra speed. IIRC, the stopping algorithm does not involve an
intelligent look-ahead, rather it just detects a sign change during a
step, and then uses a root solver (bisection, I think) on the "dense"
solution function.
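That sign-change-then-root-solve scheme can be sketched by hand on a dense solution, with brentq standing in for the bisection-style root finder (this is an illustration, not PyDSTool's C-level code):

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

def event(y):
    return y[0] - 0.25          # event fires at the zero of this

sol = solve_ivp(lambda t, y: -y, (0.0, 10.0), [1.0],
                dense_output=True)

g = lambda t: float(event(sol.sol(t)))
t_event = None
for ta, tb in zip(sol.t[:-1], sol.t[1:]):
    if g(ta) * g(tb) < 0:       # sign change during a step
        t_event = brentq(g, ta, tb)   # refine on the dense solution
        break
# analytically the crossing is at t = ln(4)
```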
> Another side benefit is that we can make the codes work identically to ode45
> from matlab, as they are using the same algorithm . . . which may not be
> better, but might help some users that seem to want these codes for
> compatibility.
I can't say I like the Matlab APIs, but if people want to make
multiple exposures of the same underlying integrator API to support
this, maybe that has its benefits for some.
> > Your scheme for producing compiled objective functions appeals less as
> > a component of scipy. It's definitely useful as a part of PyDSTool,
> > and many people working with ODEs will want to take advantage of it,
> > but I foresee problems with incorporating it in scipy. For one thing
> > it requires that the user have a C compiler installed and configured
> > in the right way, and it requires that the system figure out how to
> > call it appropriately at runtime.
I don't really see why it would be a problem, given that supporting
VODE is already an example of doing this in scipy. There are two
aspects to a solution there. The integrators can be ready-compiled
just as VODE is right now for use in binary distros, and someone could
add python vector field callback functionality to them using cython or
whatever (as Gael has apparently already done). But there would be an
option so that users with a working compiler can use it!
> > This seems like it's going to make
> > installation significantly more tricky; already installation is one of
> > the big stumbling blocks for scipy.
I don't see how there'd be a change to the installability of scipy as
a result. We (ab-)used distutils to do the compilation precisely
because it provides a platform independent way of doing so -- it works
right out of the box if you have a compiler installed. Cleaning that
up is one of the things we'd like help with.
> > A second problem is that
> > presumably it places restrictions on the right-hand-side functions
> > that one can use. In some of the cases I have used odeint, the RHS has
> > been quite complicated, passing through methods of several objects,
> > calling various functions from scipy.special, and so on. An integrator
> > that can't deal with this sort of RHS is going to be a specialized
> > tool (and presumably much faster within its domain of applicability).
This is true to some extent, but I already added support for
user-defined auxiliary functions of state variables and time (they can
also be nested) and for a range of special functions that are easily
accessible from their C math library equivalents. As for a v.f.
calculation passing through object methods: sure, that's not going to
be so easy -- but that would just be another reason to add a python
v.f. callback facility as an option. I've never needed to do that in
my work. I don't know what you're using it for, but if your code is
making non-smooth changes to the vector field (e.g. through
conditional statements) we support proper hybrid systems models based
on our integrators -- otherwise it's not safe to be changing the v.f.
non-smoothly as you'll confuse the integrator and make it spit out
inaccurate points or fail to converge.
> Isn't the Sage approach of using either optional Cython callbacks, or what I
> often do, using f2py callbacks, a good solution? That way you really don't need
> to change the APIs, and the availability of compilers is only needed on an
> opt-in basis (kind of like when using mex files in matlab).
If this means the user can choose whether to supply a python v.f.
function or ask for the system to build C code with more-or-less the
same scipy-level interface, then yes, it would seem like the ideal
solution.
> > In summary, I think that the most useful additions to scipy's
> > differential equation integration would be:
> > * A good model for solving ODEs and representing their results
> > * Ability to set stopping conditions and additional functions to be computed
> >
> What do people think about returning an object from the integrators? I have
> been using R a lot lately and I really like how they use simple data objects
> to represent the output of solutions. That way you could have an object that
> would work like an array of the solution points, but would also have the full
> output available as attributes (like the time vector, event locations, etc).
> That way, just like in R, the complexity of the solution would be hidden, but
> it gets rid of the awkward use of 'full_output' booleans, and dealing with the
> resulting tuples (which seems like a suboptimal attempt to get matlab-like
> functionality without the syntax that matlab gives for taking only the
> information you need).
Our Trajectory (and embedded Variable) objects go some way towards
this already, which is why I'm touting these classes from PyDSTool,
not just the integrators, as potentially useful for scipy. They could
be easily
extended to hold more information from the integrator in the way you
suggest. The time data is already built into these objects, as is
domain restriction/bounds checking. Also, I personally find the
index-free nature of these objects and their ability to spit out
index-free Point and Pointset objects when called at arbitrary time
values (including a vectorized version for calling with arrays of time
values) extremely liberating when using the integrators inside other
code -- I hate keeping track of indices for different types of vector.
The underlying raw output data from the integrator can also be
extracted to bypass any interpolation.
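A toy sketch of the result-object idea Gabriel describes -- something that indexes like the array of solution points but carries the full output as attributes (the names here are illustrative, not any library's actual API):

```python
import numpy as np

class Solution(np.ndarray):
    """Array of solution points with the full integrator output
    attached as attributes, instead of a 'full_output' tuple."""

    def __new__(cls, points, t, events=None):
        obj = np.asarray(points).view(cls)
        obj.t = np.asarray(t)          # time vector rides along
        obj.events = events or {}      # e.g. event locations
        return obj

t = np.linspace(0.0, 1.0, 5)
sol = Solution(np.exp(-t), t, events={"threshold": []})
sol[0], sol.t[-1], sol.events   # array-like, but self-describing
```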
-Rob