[Numpy-discussion] numpy slices limited to 32 bit values?

Glenn Tarbox, PhD glenn@tarbox....
Wed May 13 23:50:21 CDT 2009


I'm using the latest version of Sage (3.4.2), which is Python 2.5 and NumPy
something or other (I will do more digging presently).

I'm able to map large files and access all the elements, unless I'm using
slices.

so, for example:

fp = np.memmap("/mnt/hdd/data/mmap/numpy1e10.mmap", dtype='float64',
mode='r+', shape=(10000000000,))

which is 1e10 doubles, if you don't want to count the zeros,

gives full access to a 75 GB memory image

But when I do:

fp[:] = 1.0
np.sum(fp)

I get 1410065408.0 as the result.
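For what it's worth, a quick sanity check (my own arithmetic, not anything from numpy): that number is exactly 1e10 truncated to the low 32 bits, which smells like a count or index overflowing a 32-bit integer rather than data corruption:

```python
# 10**10 doesn't fit in 32 bits; its low 32 bits are exactly
# the "wrong" sum reported above.
n = 10**10
print(n % 2**32)   # -> 1410065408
```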

Interestingly, I can do:

fp[9999999999] = 3.0

and get the proper result stored and can read it back.

So, it appears to me that slicing is limited to 32-bit values.

Trying to push it a bit, I tried making my own slice

myslice = slice(1410065408, 9999999999)

and using it like
fp[myslice]=1.0

but it returns immediately, having changed nothing.  The slice creation
"appears" to work, in that I can read the start and stop values back out,
but inside numpy it seems to get thrown out.

My guess is that the Python 2.5 slice machinery is internally 32-bit, even on
my 64-bit build of Python / NumPy.

The good news is that it looks like the hard stuff (i.e. very large mmapped
files) works... but slicing is, for some reason, limited to 32 bits.
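In case anyone else hits the same wall, here's a workaround sketch (my own idea, untested against the actual 75 GB memmap): do the assignment in chunks so that no single slice comes anywhere near 2**31 elements:

```python
import numpy as np

def fill_chunked(arr, value, chunk=2**24):
    # Write in pieces so each individual slice stays far below
    # the ~2**31 element limit the full-array slice seems to hit.
    n = arr.shape[0]
    for start in range(0, n, chunk):
        arr[start:start + chunk] = value

# Small demo array; in the real case arr would be the memmap fp.
a = np.zeros(10**6, dtype='float64')
fill_chunked(a, 1.0)
print(a.sum())   # -> 1000000.0
```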

Am I missing something?

-glenn

-- 
Glenn H. Tarbox, PhD ||  206-274-6919
http://www.tarbox.org