[SciPy-user] Memory leak in delaunay interpolator
Wed Jan 16 04:38:02 CST 2008
I'm not sure who else uses the delaunay package (was in
scipy.sandbox, now lives in scikits), but I find it indispensable.
Today I found what appears to be a memory leak in the interpolator
and extrapolator objects. This simple code demonstrates the leak:
    from scipy.sandbox import delaunay  # or wherever your delaunay
                                        # package lives these days
    from numpy.random import rand

    xi, yi = rand(2, 1000)
    x, y = rand(2, 100)
    tri = delaunay.Triangulation(xi, yi)
    for n in range(100000):
        interp = tri.nn_interpolator(rand(1000))
        z = interp(x, y)
I tested this code on Mac OS X 10.4 and on a recent version of Ubuntu;
both show memory usage increasing steadily throughout the run.
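If you want to confirm the growth from inside the process rather than
watching top, one way (a sketch, not part of the original report; the
bytearray loop below is just a stand-in for the suspect interpolator
loop) is to check the peak resident set size before and after:

    import resource

    def peak_rss():
        # Peak resident set size of this process so far.
        # Units are kilobytes on Linux, bytes on Mac OS X.
        return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

    before = peak_rss()
    # ... run the suspect loop here; a stand-in allocation for demonstration:
    data = [bytearray(1024) for _ in range(10000)]
    after = peak_rss()
    print("peak RSS grew by", after - before, "units")

ru_maxrss is a high-water mark, so it never decreases; a leak shows up
as growth that keeps pace with the number of loop iterations.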
Also, while I am here, does anybody have any other advice on 2D
interpolation? 3D? This is something I need to do often, and I am
still waiting for the perfect solution to come along.
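For anyone reading this later: SciPy eventually grew a Qhull-based
answer to this question, scipy.interpolate.griddata (it postdates this
post). A minimal sketch, with made-up sample data:

    import numpy as np
    from scipy.interpolate import griddata

    rng = np.random.default_rng(0)
    xi, yi = rng.random((2, 1000))         # scattered sample locations
    zi = np.sin(6 * xi) * np.cos(6 * yi)   # values at those locations

    x, y = rng.random((2, 100))            # query points
    # method can be 'nearest', 'linear', or 'cubic' (cubic is 1-D/2-D only)
    z = griddata((xi, yi), zi, (x, y), method='linear')

griddata accepts N-dimensional point sets, so the same call pattern
covers the 3D case with 'nearest' or 'linear'; queries outside the
convex hull of the samples come back as NaN unless you pass fill_value.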
Rob Hetland, Associate Professor
Dept. of Oceanography, Texas A&M University
phone: 979-458-0096, fax: 979-845-6331