[Numpy-discussion] Another PEP needed for the Numeric community.
oliphant at ee.byu.edu
Sat Oct 8 21:50:10 CDT 2005
Full 64-bit support in Python is problematic because of the buffer and
sequence protocols, which require ints in their interfaces.
Right now, numarray supports memory-mapped arrays. SciPy can easily
support memory-mapped arrays by adapting the same module that numarray
has forged (using the frombuffer function).
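To make the frombuffer route concrete, here is a minimal sketch in modern NumPy terms; the file name and dtype are illustrative assumptions, not anything from the proposal itself:

```python
import mmap
import os
import tempfile

import numpy as np

# Hypothetical sketch: memory-map a binary file and view it as an array
# without copying.  File name and dtype are illustrative assumptions.
path = os.path.join(tempfile.mkdtemp(), "data.bin")
with open(path, "wb") as f:
    f.write(np.arange(10, dtype=np.float64).tobytes())

with open(path, "r+b") as f:
    mm = mmap.mmap(f.fileno(), 0)              # map the whole file
    arr = np.frombuffer(mm, dtype=np.float64)  # zero-copy view of the map
    arr[0] = 99.0                              # writes go through to the file
```

The key point is that the array's data region is the mapped memory itself; nothing is read into a separate buffer.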
However, the module will be limited on 64-bit systems because memory-map
support in Python is limited to 32-bit files even on 64-bit systems.
This was an intentional limitation of the memory map module in Python so
that the sequence and buffer protocols could be supported. However, the
choice to not even allow a 64-bit size memory map to be created seems
wrong. There is no need to go through the sequence and buffer interface
at all times.
If the memory-map module in Python were re-written to inherit from a
big-map object that did not export the buffer and sequence protocols (or
did so in a limited fashion), then all that is needed is for the object
to export its data pointer and size. A function could then be written
to create an array in scipy that used the mmap's exported buffer as its
data region. This would allow very big memory-mapped arrays without
waiting for Python to fix its sequence and buffer protocols (which may
not be fixed until Python 3.0).
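The pointer-plus-size idea can be sketched with ctypes. The big-map object described above does not exist, so a plain ctypes buffer stands in for the mapped region here; everything below is an illustrative assumption:

```python
import ctypes

import numpy as np

# Hypothetical sketch: given only a raw data pointer and a size (what a
# "big-map" object would export), build an ndarray view over that memory.
# A plain ctypes buffer stands in for the mapped region.
nbytes = 16 * 8                                     # room for 16 float64s
region = ctypes.create_string_buffer(nbytes)        # stand-in for mapped memory
ptr = ctypes.cast(region, ctypes.POINTER(ctypes.c_double))

arr = np.ctypeslib.as_array(ptr, shape=(16,))       # no copy: a view of region
arr[:] = 0.0
arr[3] = 42.0                                       # writes land in the region
```

Because the array is constructed from the pointer and size alone, none of the sequence or buffer protocol machinery (and its 32-bit int limits) is involved.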
Thus, a PEP stating how the mmap module should change to support 64-bit
systems is needed.
Another possible project idea for any lurking people desiring to help.
The other possibility is to just copy the nice multiplatform code in the
Python mmap module and use an ndarray as the memory-mapped object.
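This ndarray-based route is, in hindsight, essentially what later shipped as numpy.memmap. A minimal sketch, with the file name and shape as illustrative assumptions:

```python
import os
import tempfile

import numpy as np

# Sketch of the ndarray-as-memory-map route (what later became
# numpy.memmap).  File name and shape are illustrative assumptions.
path = os.path.join(tempfile.mkdtemp(), "big.dat")

mm = np.memmap(path, dtype=np.float64, mode="w+", shape=(1000,))
mm[:] = np.arange(1000)          # assignments write into the mapped file
mm.flush()                       # push changes to disk

# Reopen read-only; data is paged in on demand, not copied up front.
ro = np.memmap(path, dtype=np.float64, mode="r", shape=(1000,))
```

Since the object is itself an ndarray, it sidesteps the stdlib mmap object's sequence and buffer protocol limits entirely.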