[Numpy-discussion] Large array using 4 times as much memory as it should

Rick Giuly rgiuly@gmail....
Thu Oct 30 20:41:44 CDT 2008


Hello All,

I find that Python is using about four times as much memory as my arrays 
should need. This is a problem because I need all available memory for 
large 3D imaging datasets. Is there a way around this? Am I making a 
mistake? Is it a bug?
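
For reference, here is a minimal check of how many bytes numpy itself 
reports for one of these arrays, using the standard ndarray attributes 
itemsize, size and nbytes (Python 2 print syntax, to match the setup below):

import numpy

a = numpy.ones((1024, 1024, 50), dtype=numpy.uint32)
print a.dtype.itemsize          # bytes per element (4 for uint32)
print a.size                    # number of elements (1024*1024*50)
print a.nbytes / (1024.0 ** 2)  # total data size in MB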

(I'm running Windows XP 32-bit with 760 MB of memory "Available" according 
to the "Performance" pane of the Task Manager.)

Versions: numpy 1.2.0 with Python 2.5.2

Any help is appreciated


-Rick




**************************
Details of my testing:

Each test was run from the command line, with Python restarted (and numpy 
imported fresh) before each test. A scripted version of this measurement 
is sketched after the results below.

Testing a 50M array:
a = numpy.ones((1024,1024,50), dtype=numpy.uint32)
The available memory dropped by 200 MB.


Testing a 100M array:
a = numpy.ones((1024,1024,100), dtype=numpy.uint32)
The available memory dropped by 400 MB.


Testing a 200M array:
a = numpy.ones((1024,1024,200), dtype=numpy.uint32)
The available memory dropped by 750 MB.


Testing a 300M array:
a = numpy.ones((1024,1024,300), dtype=numpy.uint32)
An error occurs:
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "o:\software\pythonxy\python\lib\site-packages\numpy\core\numeric.py", line 1445, in ones
    a = empty(shape, dtype, order)
MemoryError
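
For completeness, here is a rough sketch of how the same measurement could 
be scripted instead of read off the Task Manager. It is Windows-only and 
queries available physical memory through GlobalMemoryStatusEx via ctypes; 
the drop it reports will include some interpreter overhead, so treat the 
numbers as approximate.

import ctypes
import numpy

class MEMORYSTATUSEX(ctypes.Structure):
    # Field layout of the Win32 MEMORYSTATUSEX structure.
    _fields_ = [("dwLength", ctypes.c_ulong),
                ("dwMemoryLoad", ctypes.c_ulong),
                ("ullTotalPhys", ctypes.c_ulonglong),
                ("ullAvailPhys", ctypes.c_ulonglong),
                ("ullTotalPageFile", ctypes.c_ulonglong),
                ("ullAvailPageFile", ctypes.c_ulonglong),
                ("ullTotalVirtual", ctypes.c_ulonglong),
                ("ullAvailVirtual", ctypes.c_ulonglong),
                ("ullAvailExtendedVirtual", ctypes.c_ulonglong)]

def available_mb():
    # Available physical memory in MB, as reported by Windows.
    stat = MEMORYSTATUSEX()
    stat.dwLength = ctypes.sizeof(stat)
    ctypes.windll.kernel32.GlobalMemoryStatusEx(ctypes.byref(stat))
    return stat.ullAvailPhys / (1024.0 ** 2)

before = available_mb()
a = numpy.ones((1024, 1024, 50), dtype=numpy.uint32)
after = available_mb()
print "numpy reports %.0f MB, available memory dropped by %.0f MB" \
    % (a.nbytes / (1024.0 ** 2), before - after)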
