[Numpy-discussion] Huge arrays
Wed Sep 9 00:22:33 CDT 2009
On Wed, Sep 9, 2009 at 2:10 PM, Sebastian Haase<firstname.lastname@example.org> wrote:
> you can probably use PyTables for this. Even though it's meant to
> save/load data to/from disk (in HDF5 format) as far as I understand,
> it can be used to make your task solvable - even on a 32bit system !!
> It's free (pytables.org) -- so maybe you can try it out and tell me if
> I'm right ....
You still would not be able to load a NumPy array > 2 GB. NumPy's memory
model needs one contiguously addressable chunk of memory for the data,
which is limited under 32-bit architectures. This cannot be overcome in
any way, AFAIK.
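To make the limit concrete, here is a minimal sketch (the shape is chosen only so the byte count lands exactly on 2 GiB; no big array is actually allocated). A NumPy array's data buffer must be one contiguous block, so a float64 array of this shape cannot exist in a 32-bit address space:

```python
import numpy as np

# Size bookkeeping only -- nothing large is allocated. A float64 array
# of this shape would need exactly 2 GiB as ONE contiguous buffer,
# which a 32-bit process cannot address.
shape = (16384, 16384)
nbytes = np.dtype(np.float64).itemsize * int(np.prod(shape, dtype=np.int64))
print(nbytes)  # -> 2147483648 (2 GiB)

# Every allocated array reports whether its buffer is contiguous:
a = np.zeros((4, 4))
print(a.flags['C_CONTIGUOUS'])  # -> True
```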
You may be able to save data > 2 GB by appending several chunks < 2
GB to disk - maybe PyTables supports this if it has large-file support
(which makes it possible to write files > 2 GB on a 32-bit system).
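The chunked-append idea can be sketched with plain NumPy file I/O (this is a hedged illustration, not PyTables; the file name and toy chunk sizes are made up for the example). Each chunk fits in memory, the file on disk can grow past any single chunk's size, and a later reader can seek to just the piece it needs:

```python
import os
import tempfile

import numpy as np

# Hedged sketch: append small chunks to one raw binary file, then read
# a single chunk back without ever holding the whole array in memory.
path = os.path.join(tempfile.gettempdir(), 'big_demo.dat')
chunk_rows, cols = 1000, 10          # toy sizes; scale as needed
with open(path, 'wb') as f:
    for i in range(4):               # 4 chunks in this toy example
        chunk = np.full((chunk_rows, cols), float(i))
        chunk.tofile(f)              # raw append; with large-file
                                     # support the file may exceed 2 GB

# Read back only chunk #2 by seeking past the first two chunks:
itemsize = np.dtype(np.float64).itemsize
with open(path, 'rb') as f:
    f.seek(2 * chunk_rows * cols * itemsize)
    part = np.fromfile(f, dtype=np.float64, count=chunk_rows * cols)
print(part[0])  # -> 2.0
os.remove(path)
```

PyTables' appendable on-disk arrays wrap essentially this pattern (plus HDF5 chunking and compression), which is why it can handle datasets far larger than a 32-bit process could hold at once.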