[Numpy-discussion] Bug in memmap/python allocation code?
oliphant.travis at ieee.org
Tue Jul 25 01:47:06 CDT 2006
Mike Ressler wrote:
> I'm trying to work with memmaps on very large files, i.e. > 2 GB, up
> to 10 GB. The files are data cubes of images (my largest is
> 1290(x)x1024(y)x2011(z)) and my immediate task is to strip the data
> from 32-bits down to 16, and to rearrange some of the data on a
> per-xy-plane basis. I'm running this on a Fedora Core 5 64-bit system,
> with python-2.5b2 (that I believe I compiled in 64-bit mode) and
> numpy-1.0b1. The disk has 324 GB free space.
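(For scale, the cube described above does indeed land in the ~10 GB range; a quick back-of-the-envelope check, assuming 4 bytes per 32-bit sample:)

```python
# Rough size of a 1290 x 1024 x 2011 cube of 32-bit samples
nx, ny, nz = 1290, 1024, 2011
nbytes = nx * ny * nz * 4   # 4 bytes per 32-bit sample
print(nbytes)               # well past the 2 GB limit of a signed 32-bit int
```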
I just discovered the problem. Every place where
PyObject_As<Read/Write>Buffer is used needs to have its final argument
changed to Py_ssize_t (which arrayobject.h defines as int when
compiling against Python versions older than 2.5).
This should be fixed in SVN shortly....
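(To illustrate the failure mode, not the fix itself: Python 2.5 widened buffer lengths to Py_ssize_t, which is 64-bit on this platform, so code that still stores the length in a C int silently truncates anything past 2 GB. A minimal sketch of that truncation, simulated here with ctypes rather than the C API:)

```python
import ctypes

# A 10 GiB mapping, as in the report above
length = 10 * 1024**3

# What a 32-bit C 'int' would hold after truncation
as_c_int = ctypes.c_int32(length).value
print(length, as_c_int)   # the truncated value is negative, hence the failure
```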