[SciPy-Dev] Introduction and feature proposal.
Thu Feb 21 13:28:30 CST 2013
I haven't looked at his code in a good while and it appears to have matured
a lot since then, so my past judgments may no longer apply. In the past
I'd noticed the same thing you had, namely that grid generation and
interpolation were rather slow. Grid generation in particular became very
slow for high-dimensional grids. My C/C++ codes for grid generation are
based on algorithms from Sandia's "Dakota" software. At the time they
significantly outperformed Klimke's toolbox.
I also had support for hierarchical grids with piecewise linear or
piecewise cubic Hermite interpolants which I don't think Klimke had at the
time (though his thesis featured these types of grids extensively so he may
have included them since).
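Roughly, the hierarchical piecewise-linear scheme works by adding midpoint
nodes level by level and storing "surpluses": the difference between f and the
interpolant built from all coarser levels. A toy 1D sketch (my own
illustration here, not the actual implementation; the function name is made up):

```python
def hierarchical_interp_1d(f, max_level):
    """Piecewise-linear hierarchical interpolation of f on [0, 1].
    Level-l nodes are the midpoints (2j+1)/2^l; the surplus stored at a
    node is f minus the interpolant built from all coarser levels."""
    # (node position, level, surplus); level 0 holds the two boundary nodes
    basis = [(0.0, 0, f(0.0)), (1.0, 0, f(1.0))]

    def evaluate(x):
        total = 0.0
        for xi, l, w in basis:
            if l == 0:                      # boundary "hats": 1 - x and x
                total += w * (1.0 - x if xi == 0.0 else x)
            else:                           # interior hat of support width 2^(1-l)
                total += w * max(0.0, 1.0 - abs(x - xi) * 2 ** l)
        return total

    for l in range(1, max_level + 1):
        for j in range(2 ** (l - 1)):
            x = (2 * j + 1) / 2 ** l
            basis.append((x, l, f(x) - evaluate(x)))   # hierarchical surplus
    return evaluate

g = hierarchical_interp_1d(lambda x: x * (1 - x), 5)
print(g(0.5))  # 0.25: nodes are reproduced exactly
```

Because the hats at a given level vanish at all coarser nodes, the surpluses
of a smooth function decay with the level, which is what makes adaptive
refinement (keep a node only if its surplus is large) cheap to drive.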
The 'killer app' in my codes was support for grid generation for
integrating with respect to arbitrary product measures. This was
accomplished by numerically approximating the one dimensional orthogonal
polynomials in each dimension and then creating a sparse grid based on the
derived Gaussian quadrature rules. This currently exists in an evolved form
inside Sandia's "Stokhos" package.
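The step from recurrence coefficients to quadrature rules is the classic
Golub-Welsch construction: build the symmetric tridiagonal Jacobi matrix from
the three-term recurrence of the (numerically approximated) orthogonal
polynomials; its eigenvalues are the nodes, and the squared first components
of the eigenvectors, scaled by the measure's total mass, are the weights. A
toy sketch of that step (not the Stokhos code):

```python
import numpy as np

def gauss_from_recurrence(alpha, beta, mu0):
    """Golub-Welsch: Gaussian nodes/weights from three-term recurrence
    coefficients.  alpha: diagonal terms a_0..a_{n-1}; beta: positive
    off-diagonal terms b_1..b_{n-1}; mu0: total mass of the measure."""
    J = np.diag(alpha) + np.diag(np.sqrt(beta), 1) + np.diag(np.sqrt(beta), -1)
    nodes, V = np.linalg.eigh(J)            # eigenvalues of the Jacobi matrix
    weights = mu0 * V[0, :] ** 2            # squared first eigenvector entries
    return nodes, weights

# Example: Gauss-Legendre on [-1, 1] (weight 1, total mass 2)
n = 5
k = np.arange(1, n)
beta = k**2 / (4.0 * k**2 - 1.0)            # known Legendre recurrence
x, w = gauss_from_recurrence(np.zeros(n), beta, 2.0)
print(w @ x**4)  # integral of x^4 over [-1, 1]: 2/5
```

For an arbitrary product measure, the alpha/beta coefficients come from the
numerical approximation step in each dimension instead of a known formula.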
I'd prefer to begin with fixed isotropic grid types (Clenshaw-Curtis,
Gauss-Legendre, Gauss-Hermite), then move to anisotropic grids, perhaps with
options to compute the anisotropic weights at runtime. Following that I'd
bring in the adaptive hierarchical grids.
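For the Clenshaw-Curtis case, the 1D nodes and weights are cheap to generate
and the point sets are nested when the level doubles, which is what makes
them attractive as a sparse-grid building block. A sketch of the standard
(n+1)-point rule on [-1, 1], using the cosine-sum weight formula:

```python
import numpy as np

def clenshaw_curtis(n):
    """Nodes and weights of the (n+1)-point Clenshaw-Curtis rule on [-1, 1]
    (Chebyshev extrema; nested when n doubles)."""
    if n == 0:
        return np.array([0.0]), np.array([2.0])
    j = np.arange(n + 1)
    x = np.cos(np.pi * j / n)                    # Chebyshev extrema
    k = np.arange(1, n // 2 + 1)
    b = np.where(k == n / 2, 1.0, 2.0)           # halve the last term if n is even
    c = np.where((j == 0) | (j == n), 1.0, 2.0)  # endpoint weights are halved
    # w_j = (c_j / n) * (1 - sum_k b_k * cos(2 k j pi / n) / (4 k^2 - 1))
    S = np.cos(2.0 * np.outer(k, j) * np.pi / n)
    w = (c / n) * (1.0 - (b / (4.0 * k**2 - 1.0)) @ S)
    return x, w

x, w = clenshaw_curtis(8)
print(w @ x**2)  # ≈ 2/3, exact for this degree
```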
On Thu, Feb 21, 2013 at 10:58 AM, Pablo Winant <email@example.com> wrote:
> On 21/02/2013 14:54, Robert Kern wrote:
> > On Thu, Feb 21, 2013 at 12:30 AM, Christopher Miller
> > <firstname.lastname@example.org> wrote:
> >> Hello,
> >> My name is Chris Miller and I'm a recent applied math PhD from
> >> University of Maryland. I have some matlab and C/C++ code for computing
> >> multi-dimensional integrals using Smolyak sparse grids that I think
> >> might fit well inside scipy.integrate. Sparse grid quadrature delays the
> >> onset of the so called "curse of dimensionality," and is very efficient
> >> for evaluating integrals with a 'moderate' number of variables, say
> >> ~<50. Sparse grids rules can also be tailored to perform integration
> >> with respect to various measures (uniform, Gaussian, beta-distribution,
> >> etc.). I'd appreciate any feedback on whether or not people would find
> >> this useful.
> > That sounds really cool! Yes, please!
> I'm very interested by it too, mostly by interpolation routines. How
> does the code you have compare to the "sparse grid interpolation
> toolbox" by Andreas Klimke? This one has adaptive dimensions and
> several types of grid, but I found it quite slow and not very useful
> when repeatedly evaluating an interpolated function.
> I also have a piece of code in Python for sparse products of Chebychev
> polynomials :
> (sorry to send that link again). It is pure vectorized numpy and I'm
> curious to see how it would compare (have no doubt it is very memory
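Chris's proposal above is built on Smolyak sparse grids; a minimal node-count
sketch (my own illustration, not his code) shows how the sparse construction
delays the curse of dimensionality. With nested 1D rules the number of
distinct grid points is the sum, over the Smolyak index set, of products of
the 1D increment sizes:

```python
from itertools import product

def m(i):
    """Nested Clenshaw-Curtis level sizes: m(1) = 1, m(i) = 2^(i-1) + 1."""
    return 1 if i == 1 else 2 ** (i - 1) + 1

def sparse_point_count(d, level):
    """Number of distinct nodes in an isotropic Smolyak grid built from
    nested 1D rules: sum over multi-indices i (i_k >= 1, sum(i) <= level + d)
    of the product of the 1D increment sizes m(i_k) - m(i_k - 1)."""
    delta = [m(1)] + [m(i) - m(i - 1) for i in range(2, level + 2)]
    total = 0
    for idx in product(range(1, level + 2), repeat=d):
        if sum(idx) <= level + d:
            p = 1
            for i in idx:
                p *= delta[i - 1]
            total += p
    return total

# d = 10, level = 3: sparse grid vs. full tensor grid at the same 1D resolution
d, level = 10, 3
sparse = sparse_point_count(d, level)
full = m(level + 1) ** d        # 9^10, about 3.5e9 points
print(sparse, full)             # the sparse count is orders of magnitude smaller
```

The small cases match the classic pictures: 5 points for d=2 at level 1 and
13 points at level 2.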