[SciPy-user] PyDSTool and SciPy (integrators)

Gabriel Gellner ggellner@uoguelph...
Thu Apr 17 19:28:13 CDT 2008

> The pieces that look most useful to me for inclusion in SciPy are the
> integrators, as you say. But it's not clear to me why I should prefer
> dopri853 or radau5 to the Adams integrator built into scipy. Are they
> somehow better for all functions? Take better advantage of the
> smoothness of objective functions? More robust in the face of stiff or
> partially-stiff systems?
Any information on this would be really interesting! I have ordered the Hairer
book, but a quick overview of the benefits of multistep vs. Runge-Kutta
methods for use in scipy would be greatly appreciated. My feeling from my
wrapper is that the multistep methods seem to be a bit faster when using
Python callbacks: they trade increased memory usage and the lack of easy
self-starting for fewer function calls (which in this case are slow Python
calls). But that tradeoff may work against the next point, i.e. easy ways to
stop the integrator . . . again, any information from someone who knows
better would be greatly appreciated.
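One rough way to see the function-call tradeoff is to count RHS evaluations made by an Adams (multistep) integrator versus an explicit Runge-Kutta one on the same smooth problem. This is just a sketch using today's scipy.integrate.ode interface (the available integrator names may differ from the wrappers discussed here):

```python
import numpy as np
from scipy.integrate import ode

calls = {"n": 0}

def rhs(t, y):
    calls["n"] += 1               # count every RHS evaluation
    return [-0.5 * y[0]]          # simple smooth, non-stiff test problem

def run(name, **kw):
    calls["n"] = 0
    r = ode(rhs).set_integrator(name, **kw)
    r.set_initial_value([1.0], 0.0)
    r.integrate(10.0)
    return calls["n"], r.y[0]

adams_calls, _ = run("vode", method="adams")   # Adams multistep (VODE)
rk_calls, _ = run("dopri5")                    # explicit Runge-Kutta
print(adams_calls, rk_calls)
```

With a slow Python callback, the difference in call counts translates fairly directly into wall-clock time, which is the effect described above.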

> What I have needed is an integrator that offers more control. The
> particular problem I ran into was one where I needed to be able to set
> stopping conditions based on the dependent variables, not least
> because my model stopped making sense at a certain point. I ended up
> wrapping LSODAR from ODEPACK, and it made my life vastly easier. If
> you have integrators that can provide that kind of control, that would
> be very useful. Similarly something like a trajectory object that
> knows enough about the integrator to provide a spline with the right
> derivatives and knot spacing would be handy.
The dopri5 and dop853 codes have a really good interface for this: after
each successful integration step they call a solution callback that allows
for dense interpolation of the solution between stepping points (with error
of the same order as the solution), and for any kind of stopping or event
detection routine. The hard part is thinking of good ways to make default
callbacks that are extensible, but it is nice that the base codes were
designed with this functionality in mind.
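For what it's worth, today's scipy wrapper for dopri5 exposes the stopping half of this design: a solout callback fired after every accepted step, which can halt the run by returning -1. (The dense-interpolation half of Hairer's codes is not exposed there; this is only a sketch of the stopping-rule interface, not the wrapper under discussion.)

```python
from scipy.integrate import ode

def rhs(t, y):
    return [y[0]]                 # y' = y, so the solution grows like exp(t)

def solout(t, y):
    # Stop once the solution leaves the region where the model makes sense.
    return -1 if y[0] > 5.0 else 0

r = ode(rhs).set_integrator("dopri5")
r.set_solout(solout)              # set before set_initial_value
r.set_initial_value([1.0], 0.0)
r.integrate(10.0)
print(r.t, r.y[0])                # stops near t = ln(5), well before t = 10
```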

Ultimately, with my Cython wrapper you could just pass in an optional
callback as well (coded in C or Cython, as otherwise it would be painfully
slow) that would give you any kind of control you might want. The way
forward, though, is to have good default templates available; I have been
looking at PyDSTool's method as well as at how MATLAB deals with this (not
the code, rather the API).

If you know of any other good examples of event detection or stopping-rule
APIs, I would love to look at them!

Another side benefit is that we could make the codes work identically to
ode45 from MATLAB, as they use the same algorithm . . . which may not be
better, but might help some users who seem to want these codes for that
reason.
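As a point of reference for an event-detection API, ode45's Events option maps naturally onto this kind of step callback. Modern SciPy's solve_ivp (whose RK45 method uses the same Dormand-Prince pair as ode45) spells it like this; a sketch for comparison only, not part of the wrappers being discussed:

```python
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, y):
    return [y[1], -9.81]          # free fall: [height, velocity]

def hit_ground(t, y):
    return y[0]                   # event fires when height crosses zero

hit_ground.terminal = True        # stop the integration there
hit_ground.direction = -1         # only trigger on the way down

sol = solve_ivp(rhs, (0.0, 10.0), [10.0, 0.0], events=hit_ground)
print(sol.t_events[0])            # impact time, about sqrt(2*10/9.81)
```

The event function is just a scalar whose sign change is located by root-finding on the dense output, which is exactly the kind of default template discussed above.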

> Your scheme for producing compiled objective functions appeals less as
> a component of scipy. It's definitely useful as a part of PyDSTool,
> and many people working with ODEs will want to take advantage of it,
> but I foresee problems with incorporating it in scipy. For one thing
> it requires that the user have a C compiler installed and configured
> in the right way, and it requires that the system figure out how to
> call it appropriately at runtime. This seems like it's going to make
> installation significantly more tricky; already installation is one of
> the big stumbling blocks for scipy. A second problem is that
> presumably it places restrictions on the right-hand-side functions
> that one can use. In some of the cases I have used odeint, the RHS has
> been quite complicated, passing through methods of several objects,
> calling various functions from scipy.special, and so on. An integrator
> that can't deal with this sort of RHS is going to be a specialized
> tool (and presumably much faster within its domain of applicability).
> On the other hand, the general problem of making callbacks run faster
> turns up in a number of places in scipy - the optimization routines,
> for example, have the same problem. A general approach to solving this
> problem would be very valuable.
Isn't the Sage approach of using optional Cython callbacks, or, as I often
do, f2py callbacks, a good solution? That way you really don't need to
change the APIs, and a compiler is only needed on an opt-in basis (much like
using MEX files in MATLAB).

> In summary, I think that the most useful additions to scipy's
> differential equation integration would be:
> * A good model for solving ODEs and representing their results
> * Ability to set stopping conditions and additional functions to be computed
What do people think about returning an object from the integrators? I have
been using R a lot lately, and I really like how it uses simple data objects
to represent solution output. That way you could have an object that works
like an array of the solution points but also has the full output available
as attributes (the time vector, event locations, etc.). Just as in R, the
complexity of the solution would be hidden, and it gets rid of the awkward
'full_output' booleans and the resulting tuples (which seem like a
suboptimal attempt to get MATLAB-like functionality without the syntax
MATLAB provides for taking only the information you need).
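A minimal sketch of the idea: an object that behaves like the array of solution values but carries the full output as attributes, so no full_output flag or tuple unpacking is needed. (The names here, OdeSolution and t_events, are illustrative only, not an existing scipy API.)

```python
import numpy as np

class OdeSolution(np.ndarray):
    """Array of solution values that also carries integrator metadata."""

    def __new__(cls, y, t, t_events=None):
        obj = np.asarray(y).view(cls)
        obj.t = np.asarray(t)                 # time vector
        obj.t_events = t_events or []         # event locations, etc.
        return obj

    def __array_finalize__(self, obj):
        # Propagate metadata through slicing and ufuncs.
        if obj is None:
            return
        self.t = getattr(obj, "t", None)
        self.t_events = getattr(obj, "t_events", [])

t = np.linspace(0.0, 1.0, 5)
sol = OdeSolution(np.exp(t), t)
print(sol[0], sol.t[-1])          # indexes like an array, but sol.t is there
```

Existing code that treats the result as a plain array keeps working, while the extra output is one attribute lookup away instead of a positional tuple slot.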

Anyway, just my thoughts. I am very excited about progress in the ode tools
for python!

