[Numpy-discussion] Memory errors

Travis Oliphant oliphant at ee.byu.edu
Thu Oct 5 18:44:05 CDT 2006

Vikalpa Jetly wrote:

>I am reading a very large array (~9000,11000) of 1 byte image values. I need
>to change values in the array that meet a certain condition so I am running
>something like:
>b = numpy.where(a>200,0,1)
>to create a new array with the changed values. However, I get a
>"MemoryError" everytime I try this. I have over 3gb of RAM on my machine
>(most of which is available). The process runs fine on smaller datasets. Is
>there a maximum array size that numpy handles? Any alternatives/workarounds?
The MemoryError is raised when the system malloc fails.  Rather than 
use where with two Python scalars (the resulting array will be int32 
and therefore 4 times larger than your 1-byte input), try something 
like:


b = numpy.ones_like(a)
b[a > 200] = 0

which will consume less memory.
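To make the memory argument concrete, here is a minimal runnable sketch (the array is shrunk from ~9000 x 11000 to keep it fast; the variable names are illustrative, not from the original post). It checks that the fill-in-place version produces the same values as where(a>200, 0, 1) while keeping the 1-byte dtype of the input:

```python
import numpy as np

# Small stand-in for the ~9000 x 11000 array of 1-byte image values
# described in the question (shape reduced so the sketch runs quickly).
a = np.random.randint(0, 256, size=(90, 110)).astype(np.uint8)

# where() with two Python scalars promotes the result to the default
# integer dtype (at least 4 bytes per element).
b_where = np.where(a > 200, 0, 1)

# Fill-in-place alternative: keeps the 1-byte dtype of `a`.
b = np.ones_like(a)   # same shape and dtype as a (uint8)
b[a > 200] = 0        # 0 where the condition holds, 1 elsewhere

assert (b == b_where).all()   # same values
assert b.itemsize == 1        # 1 byte per element
assert b_where.itemsize >= 4  # at least 4 bytes per element
```

For the ~99-million-element array in the question, that dtype difference alone is roughly 100 MB versus 400 MB or more for the output array, before counting the temporaries where() allocates.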

