[Numpy-discussion] Optimized half-sizing of images?
Stéfan van der Walt
Thu Aug 6 15:23:55 CDT 2009
2009/8/6 Christopher Barker <Chris.Barker@noaa.gov>:
> Can I do better? It seems it should be faster if I can avoid so many
> separate loops through the array. I figure there may be some way with
> filter or convolve or ndimage, but they all seem to return an array the
> same size.
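(For reference, one loop-free way to do a 2x2 box average in pure NumPy is a reshape trick; this is a sketch assuming the image has even height and width, not code from the original thread:)

```python
import numpy as np

# Hypothetical example image; any 2-D array with even dimensions works.
img = np.arange(16, dtype=float).reshape(4, 4)

h, w = img.shape
# Split each axis into (blocks, 2), then average over the two
# length-2 axes to get the 2x2 block mean -- no Python loops.
half = img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

print(half)  # shape (2, 2)
```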
Are you willing to depend on SciPy? We've got pretty fast zooming
code in ndimage.
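(A sketch of the ndimage route being suggested; the zoom factor of 0.5 and `order=1` interpolation here are illustrative choices, not from the original message:)

```python
import numpy as np
from scipy import ndimage

img = np.random.rand(512, 512)

# zoom=0.5 halves each axis; order=1 requests linear interpolation,
# which is faster than the default cubic spline.
half = ndimage.zoom(img, 0.5, order=1)

print(half.shape)  # (256, 256)
```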
If speed is a big issue, I'd consider using the GPU, which was made
for this sort of down-sampling.