[Numpy-discussion] Huge arrays

Charles R Harris charlesr.harris@gmail....
Tue Sep 8 20:41:23 CDT 2009

On Tue, Sep 8, 2009 at 7:30 PM, Daniel Platz <
mail.to.daniel.platz@googlemail.com> wrote:

> Hi,
> I have a numpy newbie question. I want to store a huge amount of data
> in  an array. This data come from a measurement setup and I want to
> write them to disk later since there is nearly no time for this during
> the measurement. To put some numbers up: I have 2*256*2000000 int16
> numbers which I want to store. I tried
> data1 = numpy.zeros((256,2000000), dtype=numpy.int16)
> data2 = numpy.zeros((256,2000000), dtype=numpy.int16)
> This works for the first array data1. However, it returns with a
> memory error for array data2. I have read somewhere that there is a
> 2GB limit for numpy arrays on a 32 bit machine but shouldn't I still
> be below that? I use Windows XP Pro 32 bit with 3GB of RAM.

More precisely, 2 GB on Windows and 3 GB on (non-PAE-enabled) Linux. The
rest of the address space is reserved for the operating system. Note that
address space is not the same as physical memory, but it caps what you can
use, whether swap or real memory.
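To make the arithmetic concrete (a quick sketch, not from the original thread): each array holds 256 * 2000000 int16 values at 2 bytes apiece, so the pair together nearly fills the 2 GB user address space on 32-bit Windows:

```python
import numpy as np

# Size of one (256, 2000000) int16 array, computed without allocating it.
per_array = 256 * 2000000 * np.dtype(np.int16).itemsize
print(per_array)             # 1024000000 bytes, ~0.95 GiB per array

# Both arrays together come to ~1.91 GiB, right at the 2 GB limit.
print(2 * per_array / 2**30)  # 1.9073486328125
```

Even though ~1.9 GiB nominally fits, each allocation also needs a single
contiguous ~1 GiB run of free address space, which is often unavailable once
Python and its loaded libraries have fragmented the address space; that is
why the second `zeros` call can fail while the first succeeds.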

