[Numpy-discussion] A memory problem: why does mmap come up in numpy.inner?

Travis E. Oliphant oliphant@enthought....
Wed Jun 4 21:10:07 CDT 2008

Dan Yamins wrote:
> I'm using python 2.5.2 on OS X, with 8 GB of ram, and a 64-bit 
> processor.  In
> this, setting, I'm working with large arrays of binary data.  E.g, I 
> want to
> make calls like:
>                Z = numpy.inner(a,b)
> where a and b are fairly large -- e.g. 20000 rows by 100 columns.
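
The size of the result is the crux here: numpy.inner contracts over the
last axis of both inputs, so two 20000 x 100 arrays produce a
20000 x 20000 output. A minimal sketch of the arithmetic (the float64
dtype is an assumption, since the post doesn't state one):

```python
import numpy as np

# Small demonstration of the shape rule: inner contracts the last axis,
# so an (m, k) and an (n, k) input yield an (m, n) result.
a = np.ones((4, 3))
b = np.ones((5, 3))
assert np.inner(a, b).shape == (4, 5)

# The sizes mentioned in the thread (assumed dtype: float64, 8 bytes each).
rows = 20000
result_bytes = rows * rows * np.dtype(np.float64).itemsize
print(result_bytes)  # 3200000000 bytes, i.e. roughly 3 GB for the result alone
```

A result of that size, plus temporaries, comes uncomfortably close to the
4 GB address space of a 32-bit process, which is consistent with the
diagnosis below.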

Hey Dan.  Now that you mention you are using OS X, I'm fairly 
confident that the problem is that you are using a 32-bit version of 
Python (i.e. you are not running in full 64-bit mode, so the 4 GB 
address-space limit applies).

The most common Python build on OS X is 32-bit.  I think a few people 
in the SAGE project have successfully built Python in 64-bit mode on OS X 
(but I don't think they have released anything yet).  You would have to 
use a 64-bit version of Python to compile NumPy against if you want to 
access that much memory.
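
One way to confirm which kind of interpreter you are running (a small
sketch; the pointer-size trick works on both Python 2 and Python 3):

```python
import struct
import sys

# A C pointer is 4 bytes in a 32-bit build and 8 bytes in a 64-bit build.
bits = struct.calcsize("P") * 8
print("This Python is a %d-bit build" % bits)

# Equivalently, sys.maxsize exceeds 2**32 only on a 64-bit build.
print(sys.maxsize > 2**32)
```

On a 32-bit build this prints 32 and False, and no amount of physical
RAM will let a single process address more than 4 GB.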

