[SciPy-user] large arrays ( > 2 gb) in numarray?

Louis Wicker Louis.Wicker at noaa.gov
Tue Feb 8 23:53:14 CST 2005


Hello,

I am new to the list.  I have been building a Python/Fortran ensemble
Kalman filter system for assimilating radar data into convective cloud
models.

I am running into a big problem that I thought I would ask about.

I am designing this system for the SGI Altix machines, which are
Linux64 boxes with > 15 GB of memory.

I ran into the problem that neither Numeric nor numarray will allow me to
allocate an array larger than 2 GB.  Is this a fundamental limitation
of numarray/Numeric, or
do I need to compile numarray/Numeric in some way to get it to use 64
bit pointers?  For example, here is the failure mode...

>>> a = numarray.zeros((200,200,50,12,40), typecode='f')
Traceback (most recent call last):
  File "<stdin>", line 1, in ?
  File "/usr/lib/python2.2/site-packages/numarray/numarraycore.py", line 1200, in zeros
    retarr = NumArray(shape=shape, type=type)
libnumarray.error: NA_updateDataPtr: error getting read buffer data ptr

Any ideas?  This is running Python 2.3 and numarray 1.1.
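For what it's worth, plain Python arithmetic shows the size of that request. The sketch below assumes 4-byte Float32 ('f') elements and a signed 32-bit size field somewhere in the C layer, which is one plausible source of a 2 GB ceiling; whether that is the actual limit in numarray is exactly the question here:

```python
# Hedged sketch: total byte size of the requested array vs. the
# signed 32-bit limit (the "2 GB" ceiling a 32-bit size field imposes).
shape = (200, 200, 50, 12, 40)
itemsize = 4  # bytes per Float32 ('f') element -- assumption from the typecode

total_elements = 1
for dim in shape:
    total_elements *= dim

total_bytes = total_elements * itemsize
int32_max = 2**31 - 1  # 2,147,483,647

print(total_bytes)              # 3840000000 bytes, about 3.6 GiB
print(total_bytes > int32_max)  # True: too big for any 32-bit size field
```

So the request is roughly 3.6 GiB, comfortably past what a 32-bit signed integer can represent, regardless of how much physical memory the Altix has.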

Thanks.

Lou Wicker
----------------------------------------------------------------------------
|  Dr. Louis J. Wicker
|  Research Scientist, National Severe Storms Lab
| 1313 Halley Cir, Norman, OK 73069
| E-mail:   Louis.Wicker at nssl.noaa.gov
| HTTP:  www.nssl.noaa.gov/~lwicker
| Phone:    (405) 366-0416
| Fax:       (405) 366-0472
----------------------------------------------------------------------------
