[Numpy-discussion] large file and array support

Travis Oliphant oliphant at ee.byu.edu
Tue Mar 29 18:13:35 CST 2005


There are two distinct issues with regard to large arrays.

1) How do you support > 2 GB memory-mapped arrays on 32-bit systems, and 
other large-object arrays only part of which is in memory at any 
given time?  (There is an equivalent problem beyond 8 EB on 64-bit 
systems; an exabyte is 2^60 bytes, i.e. a giga-gigabyte.)

2) Supporting the sequence protocol for in-memory objects on 64-bit 
systems.
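To make problem 1 concrete, here is a minimal sketch using numpy.memmap, the standard
memory-mapping interface (file name and sizes are illustrative): the mapping lazily
pages data in from disk, but the whole mapping must still fit in the process address
space, which is roughly 2-4 GB on a 32-bit system.

```python
import os
import tempfile
import numpy as np

# Illustrative file path; a real use case would map a multi-gigabyte file.
path = os.path.join(tempfile.mkdtemp(), "big.dat")

# Create a file-backed array; pages are loaded from disk only when touched.
a = np.memmap(path, dtype=np.float64, mode="w+", shape=(1000,))
a[:] = np.arange(1000)
a.flush()  # write dirty pages back to the file

# Reopen read-only: the data comes from the file, not from a prior in-memory copy.
b = np.memmap(path, dtype=np.float64, mode="r", shape=(1000,))
# b[999] == 999.0
```

On a 32-bit system this approach hits a wall well before the file does: the map
itself consumes address space, which is why arrays larger than the address space
need something other than a single flat mapping.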

Part 2 can be fixed using the recommendations Martin is making, which 
will likely be adopted (though it could certainly happen faster).  
Handling part 1 is more difficult.

One idea is to define some kind of "super object" that mediates between 
the large file and the in-memory portion.  In other words, the ndarray 
is an in-memory object, while the super object handles interfacing it 
with a larger structure.

Thoughts?

-Travis